Oscar – Phase 3 – Technical Journey driven by User Testing

The technical journey of Oscar has seen many iterations and milestones. Looking at the design and build of the code, the major iterations are listed below:

1. BrightnessThresholding: Testing image thresholding (values)
2. Face Detection V1: Testing with face detection
3. Face Detection V2: Face detection with remapping the face to fit the screen
4. Face Detection V3: Face detection, remapping and incorporating brightness thresholding
5. Kinect addition: Testing with the Kinect depth image
6. Servo addition: Servo testing program
7. Stepper Control: Stepper control code (GRBL/G-Code feedback loop)
8. Complete iteration 1: OscarDraw V1: First try to create a drawing with Oscar
9. Complete iteration 2: OscarDraw V1_2: Added remapping image to fit Oscar's dimensions
10. Complete iteration 3: OscarDraw V1_3: Added GRBL feedback loop (StepperControl Code), borders (V1_3_2)
11. Complete iteration 4: OscarDraw V1_4: Minor efficiency update (GRBL Homing)
12. Complete iteration 5: OscarDraw V1_5: Added Kinect interaction (draw speed)
13. Complete iteration 6: OscarDraw V2: Added Kinect as default camera, improved efficiency and interaction patterns
14. Complete iteration 7: OscarDraw V1_3_2
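Several of the iterations above (Face Detection V2, OscarDraw V1_2) centre on remapping the captured image to fit Oscar's physical dimensions. A minimal sketch of that remapping in Python (our actual code used Processing's built-in map() function; the camera and drawing-area dimensions below are hypothetical placeholders, not Oscar's real measurements):

```python
def remap(value, in_min, in_max, out_min, out_max):
    # Linear interpolation between two ranges, like Processing's map().
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

def pixel_to_machine(px, py, img_w=640, img_h=480, draw_w=300.0, draw_h=200.0):
    # Hypothetical sizes: a 640x480 camera frame mapped onto a
    # 300x200 mm drawing area.
    return (remap(px, 0, img_w, 0, draw_w),
            remap(py, 0, img_h, 0, draw_h))
```

The same mapping also rescales a detected face rectangle so the portrait fills the drawable area rather than one corner of it.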

User testing was the dominant driver of new iterations. The user testing experience with Oscar has been unlike any other product we had tested before. There were many fixes to make after each individual user test, and while we had hoped to do more, we ended up completing 10 successful user tests. We attempted further tests, but hardware failures due to wear and tear, stability, flexibility and range of motion stopped us on many occasions.

Result of Complete iteration 1 user test: The eye was drawn by us in pencil (showing what we thought Oscar would draw). What Oscar actually drew were mostly straight lines, largely because our x axis was unstable, so we chose to complete this first user test without the x-axis variable. We also found a risk that the pen cart and the x-axis cart would travel beyond the rails, which started to compromise Oscar's structural integrity. As a result we considered adding safety stop buttons; these came in with iteration 4.
Iteration 1

Result of complete iteration 2 user test: This was the first complete image we created with Oscar. We found that the x-axis improvements were insufficient, and that the code was “drawing” over the white areas as well, albeit with the pen elevated. As a result the drawing took an incredibly long time: 1.5 hours to complete.
Iteration 2
The 1.5 hours of drawing took a toll on the y axis pulley, and we can see it worn away in the image below.
wear and tear
Complete iteration 2 taught us a lot about the time needed to draw one artwork. Given that it takes 1.5 hours, we doubted we could hold user interest and interaction for every person that came to Oscar. This forced us to start reconsidering our interaction concept.

Result of complete iteration 3 user test:
We attempted to add border drawing to the code so that Oscar would frame the artwork, giving it a better aesthetic for being taken away. This was also part of our initial idea: if no one was near Oscar, he would draw frames to busy himself. When we incorporated this into the code, however, we found that it froze the facial drawing element, and the user test produced many frame-only drawings. We had to spend a lot of time on the code to understand why this was the case.
Iteration 3

Result of complete iteration 4 user test:
From our experiences with the carts driving off the axes, we installed safety buttons at the extreme ends of all axes. We encountered a problem with the code, however: once a button was pressed, we could not work out how to get Oscar to recognise that it had reached the edge of the axis and continue drawing. Instead, Oscar would simply stop (without going beyond the axes) and we had to manually bring the axes back to (0,0).
Stop buttons
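For context, when a GRBL hard-limit switch trips, the controller enters an ALARM state and refuses further motion until it is unlocked (`$X`) and re-homed (`$H`). A minimal Python sketch (illustrative only; our host code ran in Processing) of how a host program could react to GRBL's serial responses:

```python
def handle_grbl_response(line):
    # Decide what to send back to GRBL after reading one response line.
    line = line.strip()
    if line.startswith("ALARM"):
        # Hard limit hit: unlock, then run the homing cycle so GRBL
        # re-establishes (0,0) instead of us dragging the carts by hand.
        return ["$X", "$H"]
    if line == "ok":
        return []  # command accepted, stream the next G-code line
    return []      # other status chatter is ignored in this sketch
```

This is roughly the recovery path we lacked at the time: on a limit trigger, the machine could have re-homed itself rather than requiring a manual return to origin.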

Result of complete iteration 5 user test:

Adding the Kinect sensor to direct the speed of the drawing introduced new problems. We again caused the code to freeze: beyond the frame drawing, it never progressed to drawing the face image. We abandoned this addition, as it just produced new frame-only drawings, as above.

Result of complete iteration 6 user test:
We removed the distance-to-drawing-speed relationship and reverted to the previous code, but kept the Kinect as the image source. This produced better images, although we can still see the physical problem of the x axis slipping.
Iteration 6
What we found from the drawing above is that each line was being drawn twice, resulting in a darker image.

Result of complete iteration 7 user test:
Based on the learnings from user test 6, we sought to remove the drawing duplication in the code. The resulting image, however, was quite scary and not recognisable, so we reverted to the iteration 6 code.
Iteration 7

Overall, our finding was that Oscar V1_3_2_1 has been the most stable code, producing most of the drawings made so far; despite being slow, it worked well and produced well-drawn images without errors. We completed a couple more user tests with this most stable code.
Stable code 2
Stable code 3


A summary of the key technical challenges for Oscar:

Control of the stepper motors: Figuring out how to control the stepper motors used in the drawing machine was the first important threshold. Rob supplied us with a stepper shield that translates control input and power into stepper actions. With that sorted, the next step was how to send commands to this shield. Since we were using a computer running Processing for the image capture and processing, it made sense to use something that could interact with it, and as we had been using Processing with Arduino in the Lab, Arduino was our choice for the control.
Another point in Arduino's favour was that the platform provides stepper control libraries. However, these are quite limited, and it soon became apparent that they were not the best choice for our control. This is when Rob showed us the advantages of the GRBL library, which is designed specifically for stepper control in machines such as 3D printers. GRBL only requires G-code input, a string with the target coordinates, and then takes care of the actual control of the steppers (acceleration, number of steps, etc.), removing the need for us to do difficult calculations to create smooth and accurate motion.
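As an illustration of how little the host has to do once GRBL is in charge (a Python sketch; our real code assembled the same strings in Processing), each move is just one formatted G-code line, and GRBL plans the acceleration and step timing itself. The feed rate below is a made-up placeholder:

```python
def gcode_move(x, y, feed=1000):
    # G1 = linear move; X/Y are the target coordinates in machine
    # units, F is the feed rate. GRBL parses this and handles the
    # per-step motor control internally.
    return f"G1 X{x:.2f} Y{y:.2f} F{feed}"
```

A host program then only needs to stream these strings over serial and wait for GRBL's "ok" acknowledgements.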

Synchronising stepper and servo motors: Choosing a servo motor to put the pen on the paper and make Oscar actually draw created a synchronisation problem between the stepper control and the servo control. As GRBL cannot control the servo, a separate Arduino was needed for it. The issue is that while GRBL stores multiple commands in its buffer and smooths the motion between them, the servo control is direct and has no delay. With these two control systems, the servo might lower the pen before the steppers have reached the point where that should happen, because GRBL is still executing earlier commands from its buffer. We solved this by adding a feedback loop that checks with GRBL that it has actually reached the target position before moving the pen.
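A minimal sketch of that feedback loop in Python (illustrative only; our version ran in Processing, and the parsing assumes a GRBL 0.9-style status report such as `<Idle,MPos:10.000,20.000,0.000,...>` returned by the `?` query):

```python
import re

def parse_mpos(status):
    # Pull the machine position (x, y) out of a GRBL '?' status report.
    m = re.search(r"MPos:([-\d.]+),([-\d.]+)", status)
    return (float(m.group(1)), float(m.group(2))) if m else None

def at_target(status, target, tol=0.1):
    # Only lower the pen servo once GRBL reports the steppers have
    # physically arrived at the target, not merely accepted the command.
    pos = parse_mpos(status)
    return (pos is not None
            and abs(pos[0] - target[0]) <= tol
            and abs(pos[1] - target[1]) <= tol)
```

The host polls the status until `at_target` is true, and only then tells the servo Arduino to drop the pen.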

Image Processing: For Oscar to draw, it of course needs something to draw, and this is where Processing comes into play. We decided early on to draw straight pixel conversions of the captured image (the easiest drawing method), and since Oscar can only draw black or nothing (white), the image must contain only black or white pixels. We achieved this with image thresholding: starting from the thresholding example provided with Processing, only the levels needed to be set to create the right effect.
Having done that first step, it became apparent that we needed extra image processing, as the result generally contained large empty areas. We solved this with face recognition, focusing only on that area of the image. By focusing on the face of the user interacting with it, Oscar effectively becomes a portrait drawing machine.
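A minimal Python sketch of these two steps (our real code used Processing's threshold example and a face detection library; the pixel lists and the face rectangle here are hypothetical stand-ins for real detector output):

```python
def threshold(gray_pixels, level=128):
    # Binary thresholding: anything darker than `level` becomes black (0),
    # everything else white (255) -- the only tones Oscar can draw.
    return [0 if p < level else 255 for p in gray_pixels]

def crop_face(pixels, img_w, face):
    # Keep only the detected face rectangle (x, y, w, h), discarding
    # the mostly-empty rest of the frame.
    x, y, w, h = face
    return [pixels[r * img_w + x : r * img_w + x + w] for r in range(y, y + h)]
```

Cropping first and thresholding second means the threshold level only has to suit the face region, not the whole scene.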

Drawing methods: As stated in the Image Processing section, our first priority was controlling the stepper and servo motors, so the way Oscar draws is not particularly exciting but focused on effectiveness. It resembles a printer, as it builds the image line by line until it is complete.
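The line-by-line approach can be sketched as follows (Python for illustration; the actual Processing code generated G-code directly). Emitting pen-down segments only for runs of black pixels, rather than traversing white areas with the pen raised, was the efficiency gain that the 1.5-hour drawings showed we needed:

```python
def raster_to_segments(rows):
    # Walk a binary image row by row, like a printer: emit one
    # (x_start, x_end, y) segment per run of black (0) pixels and
    # skip white (255) runs entirely.
    segments = []
    for y, row in enumerate(rows):
        x = 0
        while x < len(row):
            if row[x] == 0:            # a black run starts here
                x0 = x
                while x < len(row) and row[x] == 0:
                    x += 1
                segments.append((x0, x - 1, y))
            else:
                x += 1
    return segments
```

Each segment then becomes a pen-down move between its endpoints, with a single pen-up travel move to the start of the next segment.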
