Live Park (Hack GT)

Live Park is a custom wireless IoT parking sensor and management system developed for Hack GT 2017 (the largest student hackathon in the Southeast). My team won the GM Prize for “Best Use of Vehicle APIs”. Our Devpost entry can be found here.

Parking in Atlanta sucks, and knowledge is power.

What it does

At its heart, Live Park is a collection of simple APIs for collecting and consuming data about parked cars. While this may sound obvious and limited in scope, we believe it to be quite the opposite.

The immediate value should be readily apparent: optimizing basic parking in decks and lots for city planners and consumers alike. With knowledge of which spots are full, consumers can find open areas more quickly, and decks can simply close when they are full. While this in and of itself is far from unique, we firmly believe in the value of the concept, and we support it both with our own hardware and with the potential for software integration.
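The "close when full" rule above boils down to a small rollup over per-spot occupancy. A minimal sketch in Go, with the caveat that the type and field names here are our own illustrative assumptions, not part of the actual Live Park API:

```go
package main

import "fmt"

// LotSummary is an illustrative rollup of per-spot occupancy that a deck
// operator or consumer app could act on.
type LotSummary struct {
	LotID    string
	Capacity int
	Occupied int
}

// Summarize folds raw per-spot readings into one summary for a lot.
func Summarize(lotID string, spots []bool) LotSummary {
	s := LotSummary{LotID: lotID, Capacity: len(spots)}
	for _, occupied := range spots {
		if occupied {
			s.Occupied++
		}
	}
	return s
}

// Full reports whether the deck should close its entrance.
func (l LotSummary) Full() bool { return l.Occupied >= l.Capacity }

func main() {
	deck := Summarize("deck-1", []bool{true, true, false, true})
	fmt.Printf("%s: %d/%d full=%v\n", deck.LotID, deck.Occupied, deck.Capacity, deck.Full())
	// prints deck-1: 3/4 full=false
}
```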

We have built a proof-of-concept “Doot”: a wireless, battery-powered parking spot monitor that interfaces with the growing API platform offered by Live Park. Doots should be rugged, reliable, and long-lasting, all traits we believe we have captured in our prototype model.

The Live Park central server is responsible for aggregating all of the IoT-level data provided by our Doots and other sensors and for surfacing easy-to-use APIs. It also acts as the gateway for interactions with the growing community of “Smart City” services and analytics.

The Live Park mobile application enables consumers to tap into this wealth of information in whatever way is most convenient to them. They can easily find Live Park enabled lots near them or their destinations, book a spot, and maybe one day even pay for their parking, all while helping optimize the system as a whole.

Finally, the Live Park web app enables the administrators of parking assets to view the live status of their entire lots and decks. Currently, this web app is part of the central server, but it could be deployed separately with relatively little work if called for.

With a single comprehensive platform in place, it becomes easy to see how one might expand into additional markets. For example, we have already seen the impact of connecting directly with cars through GM’s remote and in-vehicle APIs. Parking instructions can be transmitted directly to in-vehicle navigation, identity and payment validation can be handled automatically, and electric vehicles can automatically be given priority parking where charging is available.

Now that the flowery pitch is out of the way, let’s get into some details.

How we built it

Our prototype “Doot” runs on an Arduino Pro Mini, communicates wirelessly at 433 MHz, and detects cars using an infrared proximity sensor. It is battery powered and enclosed in a previously weatherproof box.

The Live Park “central server” is written in Go. This choice was admittedly due to the team’s familiarity with it. Even though it is a slower choice for rapid prototyping, it is one of the more performant languages in common use, and using it keeps us in the mindset of truly scalable production code. The central server currently communicates with the Doot base station (receiver) over serial, though this could change easily if the receiver were made to connect to the internet.
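The serial link is essentially a stream of text frames from the base station. A sketch of how the server side might decode it, where the `DOOT,<id>,<occupied>,<battery_mv>` line format is purely our own guess for illustration:

```go
package main

import (
	"bufio"
	"fmt"
	"strconv"
	"strings"
)

// Reading is one decoded report from a Doot; the frame layout assumed
// here is "DOOT,<id>,<occupied>,<battery_mv>".
type Reading struct {
	DootID    string
	Occupied  bool
	BatteryMV int
}

// ParseLine decodes one serial line. It returns an error for frames
// corrupted by radio noise rather than crashing the server.
func ParseLine(line string) (Reading, error) {
	fields := strings.Split(strings.TrimSpace(line), ",")
	if len(fields) != 4 || fields[0] != "DOOT" {
		return Reading{}, fmt.Errorf("malformed frame: %q", line)
	}
	occ, err := strconv.ParseBool(fields[2])
	if err != nil {
		return Reading{}, fmt.Errorf("bad occupancy flag: %q", fields[2])
	}
	mv, err := strconv.Atoi(fields[3])
	if err != nil {
		return Reading{}, fmt.Errorf("bad battery reading: %q", fields[3])
	}
	return Reading{DootID: fields[1], Occupied: occ, BatteryMV: mv}, nil
}

func main() {
	// In production the reader would be the opened serial port (e.g. a
	// /dev/tty* device); a string stands in for it here.
	serial := strings.NewReader("DOOT,A3,true,2987\nGARBAGE\nDOOT,A4,false,3102\n")
	sc := bufio.NewScanner(serial)
	for sc.Scan() {
		r, err := ParseLine(sc.Text())
		if err != nil {
			continue // drop noise-corrupted frames
		}
		fmt.Printf("%s occupied=%v battery=%dmV\n", r.DootID, r.Occupied, r.BatteryMV)
	}
}
```

Because the server only depends on an `io.Reader` of line-delimited frames, swapping serial for a network socket later would leave this decoding path untouched.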

The consumer app was developed for Android in Android Studio.

The Web App is written primarily in JavaScript and makes heavy use of the Google Maps JS APIs.

Challenges we ran into

We initially had a very, very hard time getting Android Studio up and running. None of us had ever written a mobile application before, and failing our way through the learning process took some time. While we did end up learning a lot, it felt like a lot of time was wasted on strange versioning issues that no one could quite explain to us.

The plan had been to build a pair of Doots to better demonstrate the technology, but apparently all but one of the 433 MHz transceivers were “snagged” before parts checkout even began. This meant we had to spend some extra cycles figuring out how best to demo our single sensor, though I am sure it saved us time on construction and assembly.

We also had some issues reading data from our proximity sensors for a while, but it turned out just to be noise on the line due to a faulty connection. This really highlights what we found to be one of the hardest parts of embedded systems development: you can’t always count on the hardware to work when trying to debug your software.

Accomplishments that we’re proud of

We put serious effort into making our final proof-of-concept hardware as robust as possible. We are extremely proud of how reliable our device is, how professional it looks, and how solid it feels. We have all worked on some brittle hardware hacks in the past, and decided as a team to do things right this time.

We are also proud of the performance we were able to eke out of the hardware we had access to. Once we got the MVP up and running, we just kept optimizing until we were happy. As one example, one of our team members ran the numbers and built antennas for our generic transmitter/receiver combo and literally tripled our range.

Finally, we are definitely pleased with how our mobile app turned out. It isn’t anything fancy, but it is far, far better than we were hoping given the complete lack of app development experience on our team. Which leads quite nicely into the next section…

What we learned

All of the team members are big fans of Google, but I think we all learned a lot about their ecosystem. This was the first time developing a mobile app for everyone on the team, and we dug pretty deep into the Google Maps API as well. I know Jared is excited to continue exploring mobile software development in the future.

What’s next for Live Park

The most obvious first step for Live Park would be to acquire more specialized hardware for building our Doots. We only use a tiny fraction of the bandwidth available on the radios provided at the hackathon, and the proximity sensors require a break in the face of the otherwise waterproof casing. Different radio technology would use far less power while offering much better range, and anisotropic magnetoresistance sensors would let us detect cars by their magnetic field alone. Additionally, burning all of the application logic and components onto a single SoC board would greatly reduce the size of the final product.

Another obvious fast follow would be an iOS sibling to our Android app. And shortly after, CarPlay and Android Auto versions as well.

Finally, the strategic play would arguably be the most important. Finding the right deals and partnerships (think Park Mobile) would help our technology get adopted as quickly as possible.
