Impact Recorder for Vehicles

by Ashu_d in Circuits > Raspberry Pi



FO1P1J0JXXNWZZ3.LARGE.jpg
Impact Recorder is designed to record impacts sustained by a vehicle while driving or stationary. The impacts are stored in the database in the form of sensor readings as well as video/pictures.
Upon impact, the remote user is notified in real time and can then watch the saved video, or take remote access to the Pi camera and watch events live.

Parts & Accessories

(1) Raspberry Pi 3 or better : provides the required computational power

(2) Raspberry Pi Sense HAT

(3) Raspberry Pi camera / USB camera

(4) Memory card with the latest Raspbian image (it should support Node-RED; almost every recent image does)

(5) Power supply of at least 2.1 A (I have used a battery bank for standalone operation in the car)

Parts Description : Sense Hat

sensehat.jpg

The Sense HAT has an 8×8 RGB LED matrix, a five-button joystick and includes the following sensors:

  • Gyroscope
  • Accelerometer
  • Magnetometer
  • Temperature
  • Barometric pressure
  • Humidity

More information on working with the Sense HAT can be found at the following link: Sense_Hat

The API for the Sense HAT is hosted at : Sense_hat_API

Code for Sense HAT programming is covered in later steps. Sense HAT code can also be tried out on the simulator hosted at : Sense-hat simulator

Assembling : Impact Recorder

sense-hat.jpg
IMG_20190710_231731925.jpg
IMG_20190710_231751434 (1).jpg
IMG_20190710_231809681.jpg
  • Assembly is simple, as the Sense HAT just needs to be stacked on the Pi (designated mounting bolts are provided with the Sense HAT).
  • A USB camera or Pi camera can be connected. This tutorial uses the Pi camera, and the code is written accordingly.
  • Insert the memory card and configure the Python code and Node-RED (configuration & code are covered in further steps).

The picture above shows the Pi camera connected to the Pi through a flat ribbon cable.

Assembling : Impact Recorder on Dash Board of Car

IMG_20190711_143417.jpg

For mounting the recorder I have used double-sided tape; the advantage is that the recorder can easily be shifted to whichever position suits your car best.

Further, the camera is mounted vertically as shown, using the same double-sided tape.

Next in line is to connect a power source (a 10,000 mAh power bank) along with a working internet connection.

An internet connection is required for the MQTT application (the details of MQTT are covered in further steps).

Impact Recorder : Working & Applications

From the Sense HAT, the accelerometer and gyroscope readings are used to check whether the raw values are beyond the limits set in the code.

Accelerometer : The accelerometer tells the amount of gravitational force (g-force) acting on each of the x, y & z axes; if any axis measures more than 1 g, rapid motion can be detected. (Please note that the axis pointing downward reads 1 g at rest and needs to be accounted for accordingly in the Python code.)

Gyroscope : The gyroscope measures angular motion, i.e. during a sharp turn the sensor might get activated (depending on the setting in the code), so a person whirling the vehicle sharply would get caught!!

Any breach of a set limit is also displayed on the Sense HAT LED matrix as "!", in red for acceleration and in green for gyroscope activation.
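The threshold logic described above can be sketched as a small pure function, kept separate from the hardware so it can be tuned and tested off the Pi. This is an illustrative helper, not part of the original script; the function name classify_impact is mine, while the 1.5 limits mirror the ones used in the datalogger code later on.

```python
def classify_impact(acc, gyro, limit=1.5):
    # acc and gyro are dicts with 'x', 'y', 'z' keys, in the same shape
    # as sense.get_accelerometer_raw() / sense.get_gyroscope_raw() return.
    # Returns "acc", "gyro", or None.
    if any(abs(acc[axis]) > limit for axis in ("x", "y", "z")):
        return "acc"    # shown as a red "!" on the LED matrix
    if any(abs(gyro[axis]) > limit for axis in ("x", "y", "z")):
        return "gyro"   # shown as a green "!" on the LED matrix
    return None

# The axis pointing down reads ~1 g at rest, so a quiet reading is no impact:
print(classify_impact({'x': 0.02, 'y': 0.01, 'z': 1.0},
                      {'x': 0.1, 'y': 0.0, 'z': 0.05}))   # None
print(classify_impact({'x': 2.3, 'y': 0.4, 'z': 1.1},
                      {'x': 0.1, 'y': 0.0, 'z': 0.05}))   # acc
```

Because the function takes plain dicts, the limits can be tuned against recorded drives without touching the Sense HAT itself.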

Software Description : Node Red

Node-RED is a flow-based programming tool, originally developed by IBM's Emerging Technology Services team and now a part of the JS Foundation.

More information on Node-RED can be obtained through the following link: node-red

For our case, we use Node-RED for the following activities:

(1) Interacting with the joystick to start camera functions

(2) Monitoring the impacts on the vehicle and relaying the information to the end user over MQTT, then accepting the end user's commands through MQTT and starting the requisite application on the Pi

(3) Performing some basic tasks such as shutting down the Pi

The further steps give detailed information on the flow diagrams implemented in Node-RED.

Please note that the Node-RED flow diagrams interact with the Python code, hence the latter part covers the Python code aspects.

Node-red Basics

node-red_start.JPG
node-red-start1.JPG
node-red123.JPG

A few basic steps are highlighted to get started with Node-RED in a flash; in any case, Node-RED is simple enough to begin with and build applications in.

  • Starting Node-RED locally: http://localhost:1880
  • Starting Node-RED when the Pi is connected to a network: http://<ip address>:1880
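Before those URLs work, the Node-RED service has to be running on the Pi. On Raspbian this is usually done with the commands below; `nodered.service` is the service name used by the standard Raspbian Node-RED packaging, assuming a default install:

```shell
node-red-start                          # launch Node-RED and show its log in the terminal
sudo systemctl enable nodered.service   # optional: start Node-RED automatically on every boot
node-red-stop                           # stop the running instance when needed
```

Enabling the service at boot is useful here, since the recorder runs headless on the dashboard.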

Node-red : Flow_1a

flow_1.JPG

Flow_1a monitors any changes in the CSV file; on the basis of those changes, i.e. an impact being detected, camera video recording is switched on and the user is further informed over the internet that an impact has occurred.

Node Red : Flow_1b

flow_1_b.JPG

In this flow, video recording can be started at any point by just pressing the joystick.

Node Red : Flow_2a

flow_2_a.JPG

In this flow, whenever a new picture or video is stored/uploaded to the directory, the information is relayed to the registered user over the internet.

Node Red : Flow_2b

flow_2.JPG

This flow is primarily designed for the remote user, to control the device in the following manner:

(a) Shut down the device

(b) Take pictures

(c) Record videos

(d) Start the main code (the datalogger code is the main code, which calculates the impact)

Node Red : Flow_3

flow_3.JPG

This flow is designed for local access, to start the main code or shut down the device.

MQTT

MQTT (Message Queuing Telemetry Transport) is a publish/subscribe messaging protocol that runs over TCP/IP, wherein a publisher and subscribers interact.

In our case the Pi is the publisher, whereas the application installed on our mobile/PC is the subscriber.

In this way, whenever an impact occurs, the information is relayed remotely to the user (a working internet connection is a must).

More information about MQTT can be accessed from following link : MQTT

To start using MQTT we need to register first. For this tutorial I have used CloudMQTT (www.cloudmqtt.com); there is a free plan under "Cute Cat", and that's all we need.

After registering, create an instance, say "pi", after which you will get the following details:

  • Server name
  • port
  • username
  • password

The above are required while subscribing through mobile/pc

For my application, I have used the MQTT application (Android version) from the Google Play store.
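The publishing side can also be sketched directly in Python with the paho-mqtt client library (the tutorial itself publishes through Node-RED's MQTT node; this is an alternative sketch, not the author's flow). The server name, port, username, password and topic below are placeholders for the details shown on your CloudMQTT instance page.

```python
def format_impact(readings):
    # Build a compact text payload from the six impact readings
    # (acc x/y/z, gyro x/y/z) that the datalogger writes to impact.csv.
    labels = ["acc x", "acc y", "acc z", "gyro x", "gyro y", "gyro z"]
    return ", ".join("%s=%.2f" % (k, v) for k, v in zip(labels, readings))

def publish_impact(readings):
    # Requires "pip install paho-mqtt"; import kept local so the helper
    # above also works without the library installed.
    import paho.mqtt.client as mqtt
    client = mqtt.Client()
    client.username_pw_set("username", "password")      # from CloudMQTT
    client.connect("server-name.cloudmqtt.com", 12345)  # placeholder host/port
    client.publish("pi/impact", format_impact(readings))
    client.disconnect()

print(format_impact([2.3, 0.4, 1.1, 0.1, 0.0, 0.05]))
# acc x=2.30, acc y=0.40, acc z=1.10, gyro x=0.10, gyro y=0.00, gyro z=0.05
```

The subscriber app on the mobile then simply subscribes to the same topic ("pi/impact" here) on the same server.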

MQTT : Subscriber

Screenshot_20190711-235451.png

The MQTT application running on mobile (Android version)

The impacts detected on the Pi are relayed back.

MQTT : Editing Properties in Node-red

mqtt_node-red.JPG

In Node-RED, after selecting the MQTT node, the "Server name" and "topic" have to be filled in. These should be the same on the subscriber end.

The Python Code :

The code functionality is as per attached flowchart

The Final Code

The python code is attached

In order to make our Python script runnable from the terminal, we need to make it executable with chmod +x datalogger.py; further, the top of the code should contain the following "shebang" line: #!/usr/bin/python3 (this is required so that the script can be executed from node-red).

#!/usr/bin/python3
# shebang line so the script can be executed directly from node-red
from sense_hat import SenseHat
from datetime import datetime
from csv import writer
import RPi.GPIO as GPIO
from time import sleep

sense = SenseHat()

timestamp = datetime.now()
delay = 5  # interval (in seconds) between rows stored in the data.csv file
red = (255, 0, 0)
green = (0, 255, 0)
yellow = (255, 255, 0)

# GPIO.setmode(GPIO.BCM)
# GPIO.setup(17, GPIO.OUT)


def get_sense_impact():
    # collect the six readings stored in impact.csv
    sense_impact = []
    acc = sense.get_accelerometer_raw()
    sense_impact.append(acc["x"])
    sense_impact.append(acc["y"])
    sense_impact.append(acc["z"])

    gyro = sense.get_gyroscope_raw()
    sense_impact.append(gyro["x"])
    sense_impact.append(gyro["y"])
    sense_impact.append(gyro["z"])

    return sense_impact


def impact():
    # function to detect impact
    # GPIO.setmode(GPIO.BCM)
    # GPIO.setup(4, GPIO.OUT)
    acceleration = sense.get_accelerometer_raw()
    x = abs(acceleration['x'])
    y = abs(acceleration['y'])
    z = abs(acceleration['z'])

    gyro = sense.get_gyroscope_raw()
    gyrox = round(gyro["x"], 2)
    gyroy = round(gyro["y"], 2)
    gyroz = round(gyro["z"], 2)

    impact = get_sense_impact()

    # the acceleration limits were set after iterations on actual roads and
    # can be changed for different vehicle types and driving styles
    if x > 1.5 or y > 1.5 or z > 1.5:
        with open('impact.csv', 'w', newline='') as f:
            data_writer = writer(f)
            data_writer.writerow(['acc x', 'acc y', 'acc z',
                                  'gyro x', 'gyro y', 'gyro z'])
            # GPIO.output(4, GPIO.HIGH)
            sense.clear()
            sense.show_letter("!", red)
            data_writer.writerow(impact)
    # the gyroscope limits were set looking at the speed at which turns are initiated
    elif gyrox > 1.5 or gyroy > 1.5 or gyroz > 1.5:
        with open('impact.csv', 'w', newline='') as f:
            data_writer = writer(f)
            data_writer.writerow(['acc x', 'acc y', 'acc z',
                                  'gyro x', 'gyro y', 'gyro z'])
            # GPIO.output(4, GPIO.HIGH)
            sense.clear()
            sense.show_letter("!", green)
            data_writer.writerow(impact)
    else:
        # GPIO.output(4, GPIO.LOW)
        sense.clear()


def get_sense_data():
    # function to record and store values from every sensor
    sense_data = []

    sense_data.append(sense.get_temperature())
    sense_data.append(sense.get_pressure())
    sense_data.append(sense.get_humidity())

    orientation = sense.get_orientation()
    sense_data.append(orientation["yaw"])
    sense_data.append(orientation["pitch"])
    sense_data.append(orientation["roll"])

    acc = sense.get_accelerometer_raw()
    sense_data.append(acc["x"])
    sense_data.append(acc["y"])
    sense_data.append(acc["z"])

    mag = sense.get_compass_raw()
    sense_data.append(mag["x"])
    sense_data.append(mag["y"])
    sense_data.append(mag["z"])

    gyro = sense.get_gyroscope_raw()
    sense_data.append(gyro["x"])
    sense_data.append(gyro["y"])
    sense_data.append(gyro["z"])

    sense_data.append(datetime.now())

    return sense_data


with open('data.csv', 'w', newline='') as f:
    data_writer = writer(f)
    data_writer.writerow(['temp', 'pres', 'hum', 'yaw', 'pitch', 'roll',
                          'acc x', 'acc y', 'acc z', 'mag x', 'mag y', 'mag z',
                          'gyro x', 'gyro y', 'gyro z', 'datetime'])

    while True:
        print(get_sense_data())
        for event in sense.stick.get_events():
            # Check if the joystick was pressed
            if event.action == "pressed":
                # Check which direction
                if event.direction == "up":
                    # sense.show_letter("U")   # Up arrow
                    acceleration = sense.get_accelerometer_raw()
                    x = round(acceleration['x'], 0)
                    y = round(acceleration['y'], 0)
                    z = round(acceleration['z'], 0)

                    # Update the rotation of the display depending on which way up the Pi is
                    if x == -1:
                        sense.set_rotation(90)
                    elif y == 1:
                        sense.set_rotation(270)
                    elif y == -1:
                        sense.set_rotation(180)
                    else:
                        sense.set_rotation(0)
                    sense.clear()
                    t = round(sense.get_temperature(), 1)
                    message = "T: " + str(t)
                    sense.show_message(message, text_colour=red, scroll_speed=0.09)
                elif event.direction == "down":
                    # sense.show_letter("D")   # Down arrow
                    acceleration = sense.get_accelerometer_raw()
                    x = round(acceleration['x'], 0)
                    y = round(acceleration['y'], 0)
                    z = round(acceleration['z'], 0)

                    # Update the rotation of the display depending on which way up the Pi is
                    if x == -1:
                        sense.set_rotation(90)
                    elif y == 1:
                        sense.set_rotation(270)
                    elif y == -1:
                        sense.set_rotation(180)
                    else:
                        sense.set_rotation(0)
                    sense.clear()
                    h = round(sense.get_humidity(), 1)
                    message = "H: " + str(h)
                    sense.show_message(message, text_colour=green, scroll_speed=0.09)
                    p = round(sense.get_pressure(), 1)
                    message = "P: " + str(p)
                    sense.show_message(message, text_colour=yellow, scroll_speed=0.09)
                # The left/right/middle directions are not used here; they are
                # controlled through node-red instead.
                # elif event.direction == "left":
                #     sense.show_letter("L")   # Left arrow
                # elif event.direction == "right":
                #     sense.show_letter("K")   # Right arrow
                # elif event.direction == "middle":
                #     sense.clear()

        impact()
        data = get_sense_data()

        dt = data[-1] - timestamp
        if dt.seconds > delay:
            data_writer.writerow(data)
            timestamp = datetime.now()

Downloads

Monitoring Live Video

Impact Recorder can also be used to monitor live video, since a video feed can be started anytime, anywhere through MQTT.

We use VLC player to stream the video; in the latest Raspbian, VLC is pre-installed by default, else install it with sudo apt-get install vlc.

More information on viewing a network stream can be accessed through VLC Network stream
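One common way to stream the Pi camera over the network with VLC is the raspivid-to-cvlc pipe below; exact options may vary with your raspivid and VLC versions, and the port 8554 is just a conventional choice:

```shell
# Stream the Pi camera as RTSP on port 8554 (runs until interrupted)
raspivid -o - -t 0 -w 1280 -h 720 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/}' :demux=h264
```

On the viewing machine, open VLC and choose Media > Open Network Stream, then enter rtsp://<pi ip address>:8554/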

Thank you for reading!!

There is much more the impact recorder can do...

Watch this space for magnetic field analysis for carrying out obstacle mapping.