Posted by Erik Romijn on April 3, 2013

Open for Business 4: Technical Workshop with Big Nerd Ranch

This is the fourth in a series of posts about Apps for Amsterdam: Open for Business, an initiative by Appsterdam, Amsterdam Economic Board and Waag Society to work with three local start-ups to support them in building successful businesses using open data. I’m participating with Bike Like a Local. As I started this series a bit late, over the next week I’ll be posting about the events so far.

On March 16, we had a technical workshop with Bolot Kerimbaev from Big Nerd Ranch. Big Nerd Ranch is a big name in mobile development, known particularly for their excellent development courses. This was a group session, so all the teams could also learn from each other.

Accelerometers and gyroscopes

Being a developer myself, I can figure out many problems on my own, so I had saved some of the most difficult ones for this session. My biggest challenge is the use of the accelerometer and gyroscope. I wanted to use these for two purposes: to track the stability of the cyclist, and therefore their skill; and to know when they’ve stopped cycling and gotten off their bike. I want to use the latter to help them remember where they left their bike, and to warn them if they park in one of the strict parking enforcement areas.

Accelerometers and gyroscopes are very neat sensors, but quite difficult to use. An accelerometer measures acceleration, like speeding up or slowing down, along three different axes. The gyroscope measures the rate of rotation around each of those axes. Now, getting some numbers out of the accelerometer and gyroscope is trivial. Translating between the movement a device makes and how that movement appears in the sensor data is already a bit challenging. But the real difficulty is in filtering out the signal you’re looking for. That requires intense maths, in which I have almost no experience.
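To show just how trivial the first step is, here is a minimal sketch in modern Swift using CoreMotion. The app itself predates Swift, so take this as illustrative rather than actual Bike Like a Local code: CMMotionManager delivers gravity-corrected acceleration and the gyroscope’s rotation rate in a single callback.

```swift
import CoreMotion

// Minimal sketch: read both sensors through CoreMotion.
// userAcceleration has gravity already subtracted (in units of g);
// rotationRate is the gyroscope reading in radians per second.
let motionManager = CMMotionManager()

func startMotionCapture() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // 50 Hz
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        let a = motion.userAcceleration  // acceleration along x, y, z
        let r = motion.rotationRate      // rotation rate around x, y, z
        print(String(format: "accel: %.3f %.3f %.3f  gyro: %.3f %.3f %.3f",
                     a.x, a.y, a.z, r.x, r.y, r.z))
    }
}
```

Getting the numbers really is that easy; everything that follows, making sense of them, is where the hard part starts.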

I thought I’d start by actually looking at the sensor data in known situations, to see whether I could manually recognise particular movements in the data. I installed Apple’s MotionGraphs sample app, which takes the data from all sensors and plots it on a simple graph. I then tried it in accelerating trams. Trams accelerate quite violently, so that ought to be easy to see. The result was this:

If you look very carefully, you can see a small deviation in the blue line, for the z-axis (the green line’s -1 is gravity). That’s the acceleration of a tram, which is much more regular than that of a bike. However, it’s virtually indistinguishable from other noise, even though I kept my phone very steady. In real-life situations, there would be much more noise. In other words, it’s going to be incredibly difficult to make this work for my plans.
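To illustrate the kind of filtering this would take, here is a naive first attempt one might try: a simple moving-average low-pass filter over the z-axis. The MovingAverage type below is my own sketch, not something from MotionGraphs, and even smoothing like this would leave the tram’s signature barely above the noise floor; a bike, being far less regular, would be worse.

```swift
// Naive low-pass filter: average the last `size` samples to
// suppress high-frequency noise and keep slow changes, such as
// a vehicle gradually accelerating.
struct MovingAverage {
    private var window: [Double] = []
    let size: Int

    init(size: Int) { self.size = size }

    mutating func add(_ sample: Double) -> Double {
        window.append(sample)
        if window.count > size { window.removeFirst() }
        return window.reduce(0, +) / Double(window.count)
    }
}

var zFilter = MovingAverage(size: 25)  // roughly 0.5 s of data at 50 Hz
// Inside the device motion handler:
// let smoothedZ = zFilter.add(motion.userAcceleration.z)
```

Serious approaches use far more sophisticated filters than this, which is exactly the intense maths I wanted to avoid.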

Alternative solutions

This being my most difficult problem, I showed it to Bolot, who agreed that it would be very difficult to distil the data I wanted from the sensor inputs.

We thought the process through a bit more. Another way of knowing a user got off their bike is that they stop for a while, then resume moving at a slower, walking speed. That could be detected with GPS. However, the user might already have stopped the app by the time they start walking… and then we realised… the user might already have stopped the app. I don’t need advanced sensors to find out when a tourist stops cycling: they will take their phone out of their pocket to stop the app. So as soon as they take their phone out, I can offer to stop the trip. As they’re already looking at the screen, that is also the perfect moment to warn them if they are in a strict parking enforcement area.
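In code, that idea could look something like the sketch below: listen for the app returning to the foreground, then offer to end the trip and check the parking situation. The isInStrictEnforcementArea helper is hypothetical, standing in for a lookup against the open parking-enforcement data.

```swift
import UIKit
import CoreLocation

// Sketch of the "phone out of pocket" idea: when the app comes back
// to the foreground, the user is probably done cycling, so offer to
// stop the trip and, while they are looking at the screen, warn them
// about strict parking enforcement areas.
final class TripMonitor {
    var lastKnownLocation: CLLocation?
    private var observer: NSObjectProtocol?

    init() {
        observer = NotificationCenter.default.addObserver(
            forName: UIApplication.didBecomeActiveNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.handleReturnToForeground()
        }
    }

    private func handleReturnToForeground() {
        // The user has taken the phone out: the natural moment to
        // offer ending the trip.
        offerToStopTrip()
        if let location = lastKnownLocation,
           isInStrictEnforcementArea(location) {
            showParkingWarning()
        }
    }

    private func offerToStopTrip() { /* present an alert or sheet */ }
    private func showParkingWarning() { /* present a warning */ }
    private func isInStrictEnforcementArea(_ location: CLLocation) -> Bool {
        // Hypothetical: check the location against enforcement zones.
        return false
    }
}
```

No filtering, no maths: the user’s own behaviour is the signal.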

This does not work for determining the stability of someone’s cycling. But I wasn’t too keen on that anymore, as I also lacked a plan for what to do when someone turns out to be a bad cyclist. So I’ll drop that for now.

In other words, I learned at the technical workshop that I had already solved my most complex technical problem in an incredibly simple way, without realising it.