Smart Brake Light Proof of Concept

by alinke in Circuits > LEDs



Updated January 2019

Disclaimer: Rear facing LED displays are not legal in the United States and thus this project is intended as a proof of concept only.

I had the honor of being invited to a hack-a-thon put together by Boing Boing and Ford. The theme of the hack-a-thon was applications using real-time driving data. This Instructable is about my hack-a-thon project.


I was curious to hear what Ford would bring to the table. It turns out Detroit-based Ford has a research lab in Palo Alto that has created an open source, real-time vehicle data platform called OpenXC. At the moment, OpenXC supports Android and Python.

In my view, OpenXC represents a significant milestone in Maker history, as Ford is the first car manufacturer to make real-time vehicle data available to consumers. At the moment, only Ford vehicles from model year 2010 and newer are supported, but the platform is open, so there is nothing stopping other car manufacturers from adopting it. Let’s hope more jump in!

Modern vehicles have a dedicated network called a CAN bus. The CAN bus is one of the primary components of OBD-II, a vehicle diagnostic standard mandatory for all cars sold in the United States since 1996. You'll always find the OBD-II port by the driver's knees. Your mechanic uses a handheld OBD-II scanner to read vehicle diagnostic codes or clear the check engine light. Using the OBD-II port with a piece of hardware OpenXC calls the “Vehicle Interface”, one can listen to the CAN bus and capture the desired data in an Android app.

If you’re like me and don’t own a Ford, the good news is you can still create a cool app using a simulator containing real driving data. Ford has written an Android app called the “Enabler” which lets you load trace files filled with recorded driving data. Running as a service on your Android device, the Enabler streams driving data to your app; your app doesn’t know whether that data is coming from a trace file or in real time from Ford’s “Vehicle Interface” device over USB or Bluetooth. Based on the ChipKit32 board, the “Vehicle Interface” hardware is open source as well. You’ll find the parts list and instructions to build your own Vehicle Interface on the OpenXC site, at a cost of around $110. At the time of this writing, there is also a pre-made Vehicle Interface you can buy from a company called Cross Chasm. Note that each car model requires unique firmware; to get the firmware specific to your Ford model, register at developer.ford.com and you’ll find the firmware downloads there.
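
For reference, a trace file is just newline-delimited JSON, one timestamped measurement per line, which the Enabler replays as if it were live data. The values below are made up, but the shape follows the OpenXC documentation:

{"timestamp": 1364323977.906409, "name": "vehicle_speed", "value": 42.0}
{"timestamp": 1364323977.960210, "name": "brake_pedal_status", "value": true}
{"timestamp": 1364323978.003430, "name": "accelerator_pedal_position", "value": 14.5}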

Note that you can buy inexpensive OBD-II Bluetooth dongles today and use them with smartphone apps like Torque, which raises the question: why OpenXC? The reasoning is that OpenXC provides access to more sensors than are exposed through standard OBD-II. Ford has a detailed explanation of the data set differences between OpenXC and OBD-II for those interested. In addition, at least from my research, I wasn't able to find any libraries available to developers for building custom applications on top of the inexpensive OBD-II Bluetooth dongles.

The OpenXC documentation is quite good; you’ll find all the supported data signals on the OpenXC site. As you’ll see, the OpenXC data set is pretty broad, so it’s really up to your imagination what you can create.

My idea was to use the driving data for a smart brake light proof of concept, “smart” meaning a rear display that can do more than just turn on and off when braking. For example, if the driver brakes hard, show an urgent symbol letting the driver behind you know to slow down quickly. If someone was kind enough to let you merge in, you can speak a voice command to display a thank you message.
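
As a rough sketch of how hard braking could be detected from the speed signal (the 12 km/h-per-second threshold, roughly a third of a g, and the checkForHardBrake helper are my own illustrative assumptions, not code lifted from the app), you can compare successive speed samples, which arrive at about 4 Hz, inside a speed listener like the one shown later in this Instructable:

// Sketch: estimate deceleration from successive speed samples (~4 Hz)
private double lastSpeedKmh = 0;
private long lastSampleMs = 0;

private void checkForHardBrake(double speedKmh) {
    long now = System.currentTimeMillis();
    if (lastSampleMs > 0) {
        double dtSeconds = (now - lastSampleMs) / 1000.0;
        if (dtSeconds > 0) {
            double decelKmhPerSec = (lastSpeedKmh - speedKmh) / dtSeconds;
            if (decelKmhPerSec > 12) {
                // hard brake detected, show the urgent symbol on the rear display
            }
        }
    }
    lastSpeedKmh = speedKmh;
    lastSampleMs = now;
}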

I also added a multi-color bar graph relative to the accelerator pedal position, which serves no useful purpose but looks pretty cool.

One very handy piece of data is trip fuel consumed, which tells you precisely how much gas was used per trip. I used this in combination with the ignition data to automatically speak the cost of the trip, using Android text to speech, when the ignition is turned off.
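
Here’s a minimal sketch of that trip cost idea, assuming OpenXC listeners are already updating the fuel consumed value and calling an ignition-off handler; the gas price, field names, and onIgnitionOff helper are illustrative only, and the speech uses Android’s android.speech.tts.TextToSpeech:

private static final double GAS_PRICE_PER_GALLON = 3.50;  // illustrative price
private double tripGallons = 0;   // updated by a fuel-consumed listener
private TextToSpeech tts;         // android.speech.tts.TextToSpeech, initialized in onCreate()

private void onIgnitionOff() {    // call this from an ignition status listener
    double cost = tripGallons * GAS_PRICE_PER_GALLON;
    tts.speak(String.format("That trip cost %.2f dollars", cost),
            TextToSpeech.QUEUE_FLUSH, null);
}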

You might notice in the video there are a few times where the Android app wrote back to the car. I set the cabin temperature based on the user’s preference stored in the Android app and also forced the hybrid vehicle to switch from gas to electric and vice versa. The write signals are not part of the standard OpenXC platform and were just available at the hack-a-thon using a modified Ford vehicle. It was pretty cool to be able to control the car from my Android phone. We’ll see if Ford adds support for this in the future. The obvious challenge is safety.

I had an idea to enhance the car with audio effects triggered by car events. Turning on high beams triggers a laser sound. Shifting up plays a power up sound and shifting down a power down effect. A water drop signifies each 1/10 gallon of gas consumed. When piped into your car’s audio via a line in or stereo jack from your phone, the experience is compelling.
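
As a sketch of how one of those triggers could look (R.raw.laser is a placeholder sound resource and onHighBeamChanged an illustrative helper, wired up through an OpenXC listener in the real app), Android’s android.media.MediaPlayer is enough:

private void onHighBeamChanged(boolean on) {   // call from a high beam status listener
    if (on) {
        final MediaPlayer player = MediaPlayer.create(this, R.raw.laser);  // placeholder sound
        player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            public void onCompletion(MediaPlayer mp) {
                mp.release();   // free the player once the effect finishes
            }
        });
        player.start();
    }
}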

Materials


Required Hardware and Apps

Android App Source Code https://github.com/alinke/PixelOpenXC

Using the OpenXC Android Library


The OpenXC Android library documentation and example code are quite good, so there’s no need to repeat them here. Here’s the gist of how it works:

You first set up some listeners for the specific data you’d like to use in your app. In the example below, I’m capturing vehicle speed and brake status. The OpenXC documentation will tell you how often each data type arrives. For example, speed data comes in at 4 Hz, or 4 data points a second, while brake data arrives only when there is a change, meaning when the driver steps on or off the brake pedal.

try {
    mVehicleManager.addListener(VehicleSpeed.class, mSpeedListener);
} catch (VehicleServiceException e) {
    e.printStackTrace();
} catch (UnrecognizedMeasurementTypeException e) {
    e.printStackTrace();
}

try {
    mVehicleManager.addListener(BrakePedalStatus.class, mBrakeListener);
} catch (VehicleServiceException e) {
    e.printStackTrace();
} catch (UnrecognizedMeasurementTypeException e) {
    e.printStackTrace();
}

Then add your specific logic in the listeners.

VehicleSpeed.Listener mSpeedListener = new VehicleSpeed.Listener() {
    public void receive(Measurement measurement) {
        final VehicleSpeed _speed = (VehicleSpeed) measurement;
        MainActivity.this.runOnUiThread(new Runnable() {
            public void run() {
                speed = _speed.getValue().doubleValue() * 0.621371; // convert km/h to mph

                if (speed > 75) {
                    // do something
                }
            }
        });
    }
};


BrakePedalStatus.Listener mBrakeListener = new BrakePedalStatus.Listener() {
    public void receive(Measurement measurement) {
        final BrakePedalStatus _brakeStatus = (BrakePedalStatus) measurement;
        MainActivity.this.runOnUiThread(new Runnable() {
            public void run() {
                boolean brake = _brakeStatus.getValue().booleanValue();

                if (brake) {
                    // do something
                }
            }
        });
    }
};

In the case of the speed, which is sampled continuously 4 times a second, just remember the code will keep executing there 4 times a second for as long as the speed is over 75. So, depending on what you're trying to do, you'll need to set a flag or set up a timer, otherwise your action will repeat 4 times a second.
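
Here’s a minimal sketch of the flag approach; the names and the 75 mph threshold just follow the example above, so adapt them to whatever your action is:

private boolean overSpeedHandled = false;

private void handleSpeed(double speedMph) {   // call this from the speed listener's run()
    if (speedMph > 75 && !overSpeedHandled) {
        overSpeedHandled = true;
        // do something, once
    } else if (speedMph <= 75) {
        overSpeedHandled = false;   // re-arm for the next time the threshold is crossed
    }
}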

Integrating With the PIXEL LED Display


For this project, I leveraged the PIXEL Guts LED display, which, combined with the IOIO board, has built-in Android support, making the integration with OpenXC quite easy. The Android phone talks Bluetooth to the OpenXC Vehicle Interface and Bluetooth to the PIXEL LED display. I wasn’t sure how well this would work with two parallel Bluetooth connections going, both streaming a fair amount of data, but in fact the setup worked just fine.

You can find more information on using and developing for the PIXEL LED display here:

  • PIXEL LED Display
  • Make your own PIXEL LED Display

If you happen to own a PIXEL display and don’t have the OpenXC Vehicle Interface, here’s how to set things up using the OpenXC Enabler simulator allowing you to see the display with real driving data but without hooking up to an actual car.

1. Install the PIXEL Smart Brake Light Prototype app.

2. Install an app called the Ford OpenXC Enabler. This app runs as a service on your Android device and will simulate real driving data to the PIXEL smart brake light app.

3. Install a file manager in Android like the OI File Manager so the Enabler is able to browse for files on the SD card.

4. Copy this file to your SD card. Here are also some other simulated driving data files to try.

Finally, run the OpenXC Enabler app on the device and go to Settings -> Data Sources -> Trace File Playback, check the Playback trace file option, and browse for the trace file you copied over.

5. Open the Enabler app; you’ll see a screen with increasing numbers, which means the simulated data is working.

6. Bluetooth pair as normal to PIXEL and open the PIXEL Smart Brake Light app. PIXEL should now be displaying various simulated data events: brake light, rapid deceleration (exclamation point), and colored bars corresponding to the accelerator pedal position. You’ll also hear various sound effects played on the Android device, triggered by specific car events: high beams on, changing gears, doors opening, and ignition on and off.

Mounting the Rear Display


Nothing magic here; I just used some duct tape to attach the LED display to the rear window. The PIXEL LED display is powered by a 5V, high-capacity phone power bank. In my case, I used this one along with a DC jack to USB cable.

Android App Source Code


Here's the Android app source code on GitHub: https://github.com/alinke/PixelOpenXC

The app has the following features:

  • Bars on PIXEL LED display proportional to the gas pedal position
  • Brake Light Image
  • Sci Fi sound effects for certain car events (high beams, gear shifts, ignition on and off...)
  • A few ECO features like trip cost and cost of a rapid acceleration
  • Rapid Deceleration Brake Image (using the vehicle speed data)
  • Thank you animation (for someone who lets you merge in) with voice recognition
  • Tongue animation (for the guy who didn't let you merge in) with voice recognition