H4E1-Name AID
Group 7 Cooking Interface for the Blind
Our task was to find a user-friendly way for blind people to use a cooker.
The person we brainstormed for and built our prototype for is blind: he sees only completely white or completely black images. He also has reduced sensitivity in his fingers and therefore limited sensation, so he cannot read Braille.
Our product makes it possible to use an induction cooker with touch controls. The problem was that the person in question could not feel where the control buttons were placed. With our product the buttons are easy to find, and when a button is pressed the user receives live audio feedback about the current setting of the fire.
Step 1: Brainstorm and Idea Generation
First, we started generating ideas, making sure that the touch problem was solved and that the user would also be able to hear how hot the fire was set.
Our first designs had the problem that the cooktop would have to be replaced entirely, which would be a major investment. The customer would have preferred an add-on for his existing cooker. That is why we opted for concept 3.
This concept has no sensor integrated into the cooker itself. Because all operation takes place through the product, its built-in computer always knows exactly what setting the cooker is on and can pass this information on to the user.
Step 2: Quick and Dirty First Prototype
After choosing the concept, we quickly built a cardboard mock-up to better visualise the idea and make any necessary adjustments.
Step 3: Product Programming
Since the micro:bit is limited in components and cannot run many functions at once, we tested the parts independently. Because we programmed in two different environments, we could not make everything work together in a single program.
Our first program, the one with the block structure and grey background, makes sure that when the plus or minus is pressed on the cooker, it keeps track of the setting from 0 to 9. This value is displayed on the micro:bit itself. Displaying it was not the goal, though: we used this function so that when the micro:bit reaches a certain heat, it can report this to the user.
For example, when the fire is set to 7, the product tells the user via audio that it is on 7.
You can see the audio part in the third photo, the one with the red text.
Here we tested that pressing one button speaks a certain sentence, and pressing another button speaks other information.
Once the program was set up, we extended it so that it tells you the current value of the variable.
There is also a button that runs through all the fires and tells you the current heat of each one.
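The overview button can be sketched like this (again a plain-Python illustration, not the actual micro:bit code; the heat values are made up):

```python
# Hypothetical heat values for the four fires; index 0 is fire 1.
fires = [0, 3, 7, 0]

def read_all_fires():
    # Build the sentences the speaker would read out, one per fire.
    sentences = []
    for number, heat in enumerate(fires, start=1):
        sentences.append("Fire %d is on heat %d" % (number, heat))
    return sentences

for line in read_all_fires():
    print(line)
```

On the prototype each sentence would be sent to the speaker in turn instead of printed.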
Step 4: Code Testing
The disadvantage of our speaker was that its speech was not very clear, but the principle worked and the code behind it was correct.
The second picture shows a test of changing the heat of a fire. The aluminium replaces the touch screen here.
Materials
For the frame of our prototype we used cardboard, and the touch-sensitive button on the other side is replaced for now by part of a touch pen.
The components we use for the technical aspect are:
- Push buttons
- Speaker
- micro:bit
- Mi node
- Element I4
- Cables
- Pressure sensor
Step 5: Final Prototype
In the picture you can see the product that is mounted on the hotplate. You can also see the push buttons that send information to the micro:bit, and the touch pen that is pressed against the hotplate.
The moment the value is changed, a voice says "Fire 3 is on heat 7".
On the left there is also an on/off button for the hotplate, as well as a general information button, which tells you the current heat values of all four fires.
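The behaviour of the final prototype's buttons can be summarised in a short sketch (plain Python, with names of our own choosing; only the spoken sentence "Fire 3 is on heat 7" comes from the actual prototype):

```python
hotplate_on = False  # state of the on/off button on the left

def press_power():
    # Toggle the whole hotplate on or off and report the new state.
    global hotplate_on
    hotplate_on = not hotplate_on
    return "Hotplate on" if hotplate_on else "Hotplate off"

def change_message(fire, heat):
    # Sentence spoken the moment a value changes.
    return "Fire %d is on heat %d" % (fire, heat)

def info_message(heats):
    # The general information button: one sentence per fire.
    return [change_message(i + 1, h) for i, h in enumerate(heats)]

print(change_message(3, 7))  # -> Fire 3 is on heat 7
```

Reusing the same sentence format for both the change announcement and the information button keeps the audio feedback predictable for the user.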