Code Smart 11-14 Teacher Book
Computing | Ages 11-14
Teacher Book
A resource by Encounter Edu
Acknowledgements
Encounter Edu would like to thank Chip Cunliffe, Ashley Stockwell, and Richard Jinks
from AXA XL; Prof Paul Newman, Prof Ingmar Posner, Dirk Gorissen, and Joonas Melin
from Oxbotica; Kenny Wang from Makeblock; Helen Steer and Rehana Al-Soltane
from Do It Kits; and Nat Hunter from Machines Room.
Sponsored by
AXA XL[1]
AXA XL, the property & casualty and specialty risk division of AXA, provides
insurance and risk management products and services for mid-sized companies
through to large multinationals, and reinsurance solutions to insurance companies
globally. We partner with those who move the world forward. To learn more, visit
www.axaxl.com
[1] AXA XL is a division of AXA Group providing products and services through four business groups: AXA XL Insurance, AXA XL ...
With thanks to
Oxbotica
Oxbotica, a leading driverless software startup based in Oxford, is developing
the brain behind self-driving cars. Our autonomous operating system, Selenium,
is platform agnostic and navigates autonomously around cities, warehouses and
off-road environments, using data from lasers and cameras placed on vehicles.
The software learns from both its own experiences, and those of other vehicles.
Makeblock
Makeblock Co., Ltd, founded in 2013, is a leading STEAM education solution provider.
Targeting the STEAM education and entertainment markets for schools, educational
institutions and families, Makeblock provides the most complete hardware, software,
content solutions, and top-notch robotics competitions, with the aim of achieving
deep integration of technology and education.
Contents
Foreword page 1
Overview page 2
Lessons
Welcome to
Code Smart
AXA XL has partnered with Oxbotica in the development of risk solutions related
to autonomous vehicle software. Our notion is simple: as a global (re)insurer,
building future focused risk solutions is part of our mission.
This Code Smart Teacher Book introduces our Code Smart resources for ages
11 to 14, with the aim of raising literacy in coding and smart cities and inspiring
a new generation.
We are delighted to have you join us in thinking about the future - today.

Chip Cunliffe
Director, Sustainable Development
AXA XL
About
Code Smart
Code Smart works with the mBot STEM robot and the mBlock
Scratch-based coding platform from Makeblock. Over the
following pages, you will find detailed teacher guidance, student
sheets and answer sheets, as well as background information in
the form of Subject Updates. Further supporting resources such as
videos and slideshows are available to download from the Code
Smart website.
Applicable standards

The national curriculum in England: KS3 Computing
[Mapping table: elements of the curriculum against Lessons 1-9]

International Society for Technology in Education (ISTE) Standards for Students
[Mapping table: elements of the standards against Lessons 1-9]
Video:
Uploading code to your robot
Lesson 3: Where am I?

Overview
In the third lesson of this unit of work, your class will learn about the role of mapping in autonomous vehicles. They will then be challenged to make their robot vehicle follow a path using new hardware and software. Students will learn about how the line follower sensor on the mBot works, then use the sensor data and logic-based code to autonomously control their robots. Finally, they will relate this activity to real world driverless vehicles and discuss the other sensors needed to make driving efficient and safe.

Learning outcomes
• Understand why mapping is important for programming autonomous cars
• Describe functions carried out by software and hardware
• Describe how a line follower functions
• Apply logic to get a robot to follow a defined path
• Debug programs
• Identify and discuss ways of solving the same problem under different conditions

Resources
Slideshow 3: Where am I?
Student Sheet 3a: Explaining the line follower
Student Sheet 3b: Coding the line follower
Answer Sheet 3b: Coding the line follower
Video: Where am I?
Teacher guidance
The Teacher Guidance for each lesson uses a set of icons to provide visual cues to support teachers:

Lesson activities
• Explain: teacher exposition, using slides or a script as support
• Demonstration / watch: students watch a demonstration or video
• Student activity: an activity for students to complete individually, such as questions on a Student Sheet
• Group work: an activity for students to complete in pairs or small groups
Lesson Overview 1
Teacher Guidance 1
Student Sheet 1a: What are the differences between robots and humans?

Robot cars
Age 11-14 (Key Stage 3)

Computing
• Understand the hardware and software components that make up computer systems, and how they communicate with one another and with other systems

Overview
In the first lesson of this unit of work, we will introduce your class to the concept of robotics and autonomous cars. Your students will discuss what cities in the future will look like, and think about the role that robots and autonomous vehicles will play. Next, your class will work in groups to complete the main challenge of the first lesson: building the mBot, then getting it moving around the classroom with the remote control.

Resources
Slideshow 1: Robot cars and smart cities
Student Sheet 1a: What are the differences between robots and humans?
Video: Building your robot
360 Video: Look! Driving with no hands!
Image: Comparing my mBot to a driverless car interactive
Subject Updates: What is a smart city?; How can autonomous vehicles be useful?; Troubleshooting guide; mBot's default program

Kit (per group)
• mBot kit
• Note batteries are not included in mBot kits: the mBot requires either 4 x 1.5V AA batteries or a 3.7V DC lithium battery, and the Makeblock remote control requires a CR2025 coin cell battery

Lesson steps and learning outcomes
1. 360 video opener (5 mins)
Introduction to smart cities and autonomous vehicles. Students get a 360 insight into a driverless car to provide context for this lesson's challenge: to build a robot vehicle.
• Name a benefit of smart cities
• Describe characteristics of an autonomous vehicle
2. Classroom discussion (10 mins)
Students will discuss the human responses to the three questions of autonomy. Then they will discuss additional differences between robots and humans.
• Compare characteristics of robots and humans
3. Make (30 mins)
Build the mBot as per the instruction manual and guidance video.
• Construct a robot
• Name the basic parts of the robot
4. Test (10 mins)
Test the mBot works with the remote control.
• Control a robot using a simple remote
5. Reflect (5 mins)
Students will reflect on their learning, including problems they had and how they solved them.
• Identify and share problems encountered with solutions
Step
Where am I?
· They would use their senses.
· They would use their eyes to look around.
· They might use their ears to listen to their surroundings.
· For example, if you accidentally fall asleep on the train and wake up uncertain where you are.

Step
What do I do next?
· They might follow directions to the destination based on what they know already or based on a map.
· They will also follow the rules of the road. For example, if you're driving then you shouldn't drive through a red light.
· They will also be aware of other road users and predict what they might do based on signalling, so they don't crash.

Step
Where am I?
· Does this robot know where it is? No, not yet.
· What would we have to add to this robot for it to know where it is? Sensors to act as its eyes and ears.

What do I do next?
· Does this robot know what to do next? No, not yet.
· What would we have to add to this robot for it to know what to do next? We would have to program it (give it a set of instructions).
3. (a) What are the risks of robots and humans sharing the road?
Lesson Overview 2
Teacher Guidance 2

Controlling cars
Age 11-14 (Key Stage 3)

Computing
• Understand the hardware and software components that make up computer systems, and how they communicate with one another and with other systems
• Use a programming language to solve a variety of computational problems

Overview
In the second lesson of this unit of work, we will introduce your class to the concept of using code to control cars. Your students will discuss how programming a car compares to programming a standard computer. Next, your class will work in groups to complete the main challenge of the second lesson: programming the robot to drive in different shapes around the classroom.

Lesson steps and learning outcomes
Step
1. What is code?
2. What does code do?
3. Is programming a car the same as programming a standard computer?
· The students might think it would be completely different, but we can actually use the same or similar programming languages, using similar logic.
Electric motors explainer
To drive your robot car you will be using code to control the motors. The type of motor
you will be controlling is a DC motor, but it’s good to know there are other types of
motors that you may encounter.
All these motors make use of something called an H bridge to be able to drive the
motor backwards and forwards. The servo is the only motor to have it built in. The
mBot has a DC motor and the board we will program to control the motors has an
H bridge.
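To make the H bridge idea concrete, here is a minimal Python sketch that models the four switches around the motor and reports which way it would turn. The function name and switch labels are illustrative assumptions only; on the mBot this switching is done in hardware by the motor driver board.

```python
# Minimal model of an H bridge: four switches (A, B, C, D) around a DC motor.
# Closing A and D drives current one way; closing B and C reverses it.
# (Illustrative only; the mBot's board does this for you in hardware.)

def h_bridge_state(a, b, c, d):
    """Return what the motor does for a given combination of closed switches."""
    if a and d and not (b or c):
        return "forward"   # current flows through the motor left to right
    if b and c and not (a or d):
        return "reverse"   # current flows through the motor right to left
    if not (a or b or c or d):
        return "coast"     # no current: the motor spins freely
    return "invalid"       # e.g. closing A and B together would short the battery

print(h_bridge_state(True, False, False, True))   # forward
print(h_bridge_state(False, True, True, False))   # reverse
```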
Controlling cars with code

1. In the robots blocks under mBot, drag out the block and put it at the top of your program. All our mBot programs must start with this block.
Tip: If you don't see this block, click on the Boards menu and make sure mBot (mCore) is selected.

3. You might think our program is now finished, but if we load it onto the robot it will never turn the engine off and will keep running forward until we turn it off or the batteries run out. We also need to add in the same block with speed set to 0.
Tip: Do you think this will work as it is? This will cause the robot to turn the motor on then off again straight away, so the robot won't move.
More challenge
14. Can you make the robot drive in a circle?
Tip: Driving in a circle is kind of like driving a wide turn.
Solution: drive the motor forward and then reverse back to the same position.

Lines 5-6: The amount of time to turn right for will vary depending on the surface the robot is being tested on.

Note that the angle of the turn is dependent on the speed of the engine and the time it is on for. Also note that the same program might not work the same on all surfaces due to friction.
Drive in a square by controlling each motor individually. This is essentially the same as the last program, but we need a block to control each motor. M1 is the left engine (when looking from the back) and M2 the right.

When turning, one wheel moves forward and the other wheel moves backward at the same speed to turn on the spot. For example, to turn right the left wheel moves forward and the right wheel moves backward. It is possible to turn right by just moving the left wheel, but this would make it a much wider turn and impossible to drive in a square.
Lines 3-4: Set both motors to speed 100.
Lines 6-7: Set motor 1 to go forward at speed 100 and motor 2 to drive in reverse at speed 100. This will cause the robot to turn right on the spot.
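As a text version of the same logic, the sketch below walks through the drive-in-a-square answer. The helpers `set_motor(port, speed)` and `wait(seconds)` are hypothetical stand-ins for the mBlock motor and wait blocks, and the 0.5 second turn time is only a placeholder that would need tuning for your floor surface and battery level.

```python
# Sketch of the drive-in-a-square answer, controlling each motor individually.
# set_motor() and wait() are hypothetical stand-ins for the mBlock blocks.

def set_motor(port, speed):
    print(f"motor {port} -> speed {speed}")   # stand-in for the "set motor" block

def wait(seconds):
    pass   # stand-in for the "wait" block; the motors keep running meanwhile

TURN_TIME = 0.5   # seconds for a ~90 degree turn; tune for your floor surface

for _ in range(4):                # four sides of the square
    set_motor("M1", 100)          # both wheels forward: drive one side
    set_motor("M2", 100)
    wait(1)
    set_motor("M1", 100)          # left wheel forward, right wheel in reverse:
    set_motor("M2", -100)         # turn right on the spot
    wait(TURN_TIME)

set_motor("M1", 0)                # turn both motors off at the end
set_motor("M2", 0)
```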
Drive in a circle
To drive the robot in a circle we just need to set the motor speeds to be different. The difference
between each of them will affect the size of the circle. To drive clockwise we make the left wheel
drive faster than the right.
Lines 2-3: Set the motors to different speeds.
Drive in a hexagon
Driving in a hexagon is similar to driving in a square, but we have to draw 6 sides and adjust the
angle.
Line 2: Repeat the set of blocks inside this repeat block, in order, 6 times.
Lines 3-7: Draws 1 line of the hexagon and turns to be at the beginning of the next line.
Lines 5-6: The amount of time to turn right for will vary depending on the surface the robot is being tested on. It should be less time than for driving in a square.
Drive in a pentagon
Driving in a pentagon is similar to driving in a square, but we have to draw 5 sides and adjust the
angle.
Line 2: Repeat the set of blocks inside this repeat block, in order, 5 times.
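The hexagon and pentagon answers share one pattern: a repeat block around "drive one side, then turn". A rough Python equivalent is sketched below; the helper functions again stand in for the mBlock blocks and the timings are placeholders to tune on your own surface.

```python
# Generalising the square: repeat (drive one side, then turn) N times.
# set_motor() and wait() are hypothetical stand-ins for the mBlock blocks.

def set_motor(port, speed):
    print(f"motor {port} -> speed {speed}")

def wait(seconds):
    pass

def drive_polygon(sides, turn_time, side_time=1.0):
    """Drive a regular polygon: the robot turns 360/sides degrees per corner,
    so a hexagon needs a shorter turn time than a square."""
    for _ in range(sides):
        set_motor("M1", 100)          # both wheels forward: draw one side
        set_motor("M2", 100)
        wait(side_time)
        set_motor("M1", 100)          # spin right on the spot
        set_motor("M2", -100)
        wait(turn_time)
    set_motor("M1", 0)                # stop at the end
    set_motor("M2", 0)

drive_polygon(6, turn_time=0.33)      # hexagon: turn times need tuning per surface
drive_polygon(5, turn_time=0.40)      # pentagon
```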
Lesson Overview 3
Teacher Guidance 3
Slideshow 3: Where am I?
Video: Where am I?

60 minutes

Computing
• Understand the hardware and software components that make up computer systems, and how they communicate with one another and with other systems
• Understand simple Boolean logic [for example, AND, OR and NOT] and some of its uses in circuits and programming
• Use a programming language to solve a variety of computational problems

Overview
In the third lesson of this unit of work, your class will learn about the role of mapping in autonomous vehicles. They will then be challenged to make their robot vehicle follow a path using new hardware and software. Students will learn about how the line follower sensor on the mBot works, then use the sensor data and logic-based code to autonomously control their robots. Finally, they will relate this activity to real world driverless vehicles and discuss the other sensors needed to make driving efficient and safe.

Lesson steps and learning outcomes
Step
Refer to the Explaining the line follower guide for the information you are about to present. The line follower can detect a white surface (within the range of 1-2cm). It works by emitting IR (infrared) light and ...

Step
4. In the hexagonal space, we need to add an operator block. We want the block that tells if the line follower output equals something. Remember ...
Tip: The equals block is green so it's in the operator type blocks. To change the number ...
Explaining the line follower
The line follower is on the bottom of the mBot at the front. The line follower has 2
sensors which can detect whether a surface is black or white within a range of 1-2cm.
If you look at each sensor you can see the LED (dark bulb) at the back and the
photodiode (clear bulb) at the front.
The LED (light emitting diode) shines IR (infrared) light. Any light that is reflected off a surface is received by the photodiode. A white surface, shown here, reflects a lot of light. A dark surface only reflects a little light. Also, if the LED is far from a surface, only a little light will be reflected.
STUDENT SHEET 3b
Coding the line follower
Today we are going to learn how to code a car to follow a black line.
Let’s get moving!
1. In the robots blocks under mBot, drag out the block and put it at the top of your program. All our mBot programs must start with this block.
Tip: If you don't see this block, click on the Boards menu and make sure mBot (mCore) is selected.

11. Once the program is loaded onto the robot it will run straight away, so place it on the floor.
Tip: Be careful it doesn't run off the desk or table.

13. There are multiple ways to check the sensors. Let's try using Boolean logic. You will need this and block. This will return true only if both parts are true.
Tip: Remove both of the equals operator blocks. You'll only need one and block to check both sensors.
Lines 3-4: If-else block. If both sensors can see the black line then drive forward.
Lines 5-9: Else, another if-else block which adjusts the robot's direction so both sensors are on the line again. If only sensor 1 can see the black line then we have to turn slightly to the left until both sensors can see the black line again. Otherwise, we are too far to the left and have to turn right to fix the robot's direction.

This is more or less the same as the last program but uses a different method of checking the sensors.

Line 3: This is true if both conditions are true, i.e. if condition 1 AND condition 2 are true. So this is only true when both sensors see black. This is the same as saying line follower (port 2) = 0.
Line 6: If only the left sensor sees black.
Line 8: Otherwise only the right sensor must see black.
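The decision logic of both answers can be written out in text form. In the sketch below, `sensor1_on_black()`, `sensor2_on_black()` and `set_motor()` are hypothetical stand-ins for the mBlock line follower and motor blocks; only the if/else structure mirrors the answer above.

```python
# Sketch of the line-following decision logic from the answers above.
# sensor1_on_black(), sensor2_on_black() and set_motor() are hypothetical
# stand-ins for the mBlock line follower and motor blocks.

def follow_line_step(sensor1_on_black, sensor2_on_black, set_motor):
    if sensor1_on_black() and sensor2_on_black():   # Boolean AND: both sensors see black
        set_motor("M1", 100)                        # on the line: drive forward
        set_motor("M2", 100)
    elif sensor1_on_black():                        # only sensor 1 sees black
        set_motor("M1", 40)                         # turn slightly left until both
        set_motor("M2", 100)                        # sensors see the line again
    else:                                           # too far left: only sensor 2 sees black
        set_motor("M1", 100)                        # turn right to fix the direction
        set_motor("M2", 40)

# On the mBot this sits inside a forever loop, so the sensors are re-checked
# many times a second.
```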
Lesson Overview 4
Teacher Guidance 4

60 minutes

Computing
• Understand the hardware and software components that make up computer systems, and how they communicate with one another and with other systems
• Understand simple Boolean logic and some of its uses in circuits and programming
• Use a programming language to solve a variety of computational problems

Overview
In the fourth lesson of this unit of work, your class will learn about obstacles and sensors. They will discuss the risks that a smart city presents, focusing on the challenges that an autonomous vehicle faces while navigating in the real world. They will then learn about the ultrasonic sensor on board the mBot and how to use it to avoid obstacles. Finally, they will think about how driving speed can influence a vehicle's ability to react to obstacles.

Resources
Slideshow 4: What's around me?
Student Sheet 4a: Explaining the ultrasonic sensor
Student Sheet 4b: Coding the ultrasonic sensor
Answer Sheet 4b: Coding the ultrasonic sensor

Lesson steps and learning outcomes
1. Video opener (5 mins)
An introduction to smart cities and risk. The video will set this lesson's challenge: to get the robot to notice and react to obstacles.
• Describe what smart cities are and give examples of smart city initiatives
2. Classroom discussion (10 mins)
Students will discuss the video. What risks can they think of that their autonomous car will need to notice and avoid?
• Describe risks for autonomous vehicles in smart cities
3. Make (25 mins)
Introduction to the ultrasonic sensor hardware and how to control it with code. Students should code the robot to stop when it detects an obstacle, then react and manoeuvre around obstacles of known and unknown sizes.
• Describe how an ultrasonic sensor functions
• Apply code and sensor output to navigate around an obstacle
Explaining the ultrasonic sensor
The ultrasonic sensor measures distance. It is on the front of the mBot and looks like
a set of eyes. We can use it to detect obstacles.
One of the ‘eyes’ transmits a sound, and the other waits for the echo of the sound to
return. From the time this takes, the distance of the object from the sensor can be
calculated. The ultrasonic sensor has a range of 3-400cm.
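The echo-time calculation can be shown with a short worked example. The speed of sound used below is a standard physics value, not something read from the mBot, which reports the distance in centimetres for you.

```python
# Worked example: turning an echo time into a distance.
# The sound travels to the obstacle and back, so the one-way distance is
# half the round trip. (The mBot does this for you and reports centimetres.)

SPEED_OF_SOUND = 343.0   # metres per second in air at about 20 degrees C

def echo_time_to_distance_cm(echo_time_seconds):
    round_trip_m = SPEED_OF_SOUND * echo_time_seconds
    return round_trip_m / 2 * 100    # one-way distance in centimetres

# An echo of about 1.2 milliseconds corresponds to roughly 20 cm:
print(round(echo_time_to_distance_cm(0.0012), 1))    # ~20.6
```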
Coding the ultrasonic sensor

1. In the robots blocks under mBot, drag out the block and put it at the top of your program. All our mBot programs must start with this block.
Tip: If you don't see this block, click on the Boards menu and make sure mBot (mCore) is selected.

10. Once the program is loaded onto the robot it will run straight away, so place it on the floor.
Tip: Be careful it doesn't run off the desk or table.
More challenge

12. Can you write the code for another way to avoid the obstacle? Try to come up with as many as you can.
Tip: How could you do this? What options do you have?

16. This is how your code might start. Complete the code to tell your robot what to do if the number equals 0 and what to do if it doesn't equal 0.
Tip: Don't forget! You may need the wait block to make sure your robot has enough time to complete your instructions.
Lines 5-6: Else, drive forward.

The robot could turn left or right when it encounters an obstacle. The simplest solution is to just turn left every time or just turn right every time.

Line 2: Forever loop. Keep running the code in this loop as long as the robot has power.
Lines 3-4: If-else block. If an obstacle is detected less than 20cm away (you may want to make this larger than the check in the first program so the robot has enough time to turn), turn left. This will keep happening until the obstacle is no longer detected, as the if condition will constantly be checked as it's in a forever loop.
Method 2: Reverse then turn one direction whenever the robot detects an obstacle
If an obstacle is too close or appears suddenly (if we place an obstacle in front of the robot while it's driving), it may be necessary to have the robot reverse and then turn later.

Line 2: Forever loop. Keep running the code in this loop as long as the robot has power.
A more complex solution would see the robot picking at random whether to turn left or right each time it encountered an obstacle.

Line 2: Forever loop. Keep running the code in this loop as long as the robot has power.
Lines 3-5: If-else block. If an obstacle is detected less than 20cm away, pick a number between 0 and 1 randomly. If the number is 0, turn left.
Line 6: Without the wait block, the robot will never make a full turn unless it randomly generates the same number repeatedly, which wouldn't make it a very good random number generator.
Method 4: Reverse then turn a random direction whenever the robot detects an obstacle
This last solution is more complex and makes the robot reverse slightly before performing a random turn.

Lines 4-5: These are the only blocks that are different. They just make the robot reverse for 1 second first before randomly turning left or right.
If an obstacle is too close or appears suddenly, an appropriate solution would require reversing
before turning. See method 2 or 4 above.
If we want the robot to turn randomly rather than just one direction, then a random number generator is necessary. See methods 3 and 4 above for solutions that make the robot turn randomly.
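Methods 3 and 4 can also be sketched in text. In the sketch below, `read_distance_cm()`, `set_motor()` and `wait()` are hypothetical stand-ins for the ultrasonic, motor and wait blocks, and the 20cm threshold and 1 second timings are placeholders to tune.

```python
import random

# Sketch of method 4: reverse, then turn in a random direction, whenever the
# ultrasonic sensor detects an obstacle. read_distance_cm(), set_motor() and
# wait() are hypothetical stand-ins for the mBlock blocks.

def avoid_obstacles_forever(read_distance_cm, set_motor, wait):
    while True:                               # forever loop
        if read_distance_cm() < 20:           # obstacle closer than 20 cm
            set_motor("M1", -100)             # reverse first so there is
            set_motor("M2", -100)             # room to turn
            wait(1)
            if random.randint(0, 1) == 0:     # pick 0 or 1 at random
                set_motor("M1", -100)         # 0: turn left
                set_motor("M2", 100)
            else:
                set_motor("M1", 100)          # 1: turn right
                set_motor("M2", -100)
            wait(1)                           # without this wait the robot
                                              # never completes the turn
        else:
            set_motor("M1", 100)              # path clear: drive forward
            set_motor("M2", 100)
```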
Lesson Overview 5
Teacher Guidance 5

60 minutes

Computing
• Understand the hardware and software components that make up computer systems, and how they communicate with one another and with other systems
• Understand simple Boolean logic and some of its uses in circuits and programming
• Use a programming language to solve a variety of computational problems

Overview
In the fifth lesson of this unit of work, your class will learn about how technology and people interact. They will learn about signalling movement and giving warnings with light and sound. Students will use code to control their robot car's LEDs and buzzer to produce lights and sounds for a variety of different scenarios. They will then combine an input (the ultrasonic sensor) and an output (the LEDs and buzzer) to create a proximity sensor. Finally, they will discuss the other sensors and signals they think would be useful for their robots and autonomous vehicles in real life.
Step

Step
Each set of LEDs has three separate colours. We can control the overall colour shown by combining the strength of each colour, going from 0 for no light, to 255 for full power.
Step
In this lesson, you will code the robot to respond to remote control steering. You will also operate the LED lights and buzzer based on the steering. Then you'll also create a proximity sensor by combining the input from the ultrasonic sensor and the output from the LEDs and buzzer.

Let's get moving!

Each set of LEDs has three separate colours. We can control the overall colour shown by combining the strength of each colour, going from 0 for no light, to 255 for full power. The table below shows how to mix the lights to make red, yellow and white.

[Colour mixing diagram: red, green and blue light overlap to give yellow, magenta, cyan and white]

Red  Green  Blue  Result
255  0      0     Red
255  255    0     Yellow
255  255    255   White
STUDENT SHEET 5a
Next we are going to use the ultrasonic sensor to add a proximity sensor to your
robot. This will act like parking sensors to notify you if there’s an obstacle and tell you
roughly how close you are to it, using different notes and beats.
There are a few ways to code remote control steering, but we have provided starter code for the
“best” way to do it: nested if-else statements, or if-else statements within each other.
However, let’s look at 3 seemingly logical examples to code steering and some of their problems
because some of your students might try these methods.
Example 1
If the code uses several separate if statements and looks like this…
Lines 3-4: If the button with the up arrow is pressed then drive forward at full speed.

Then the if statement blocks will compete with the stop block. This will cause the robot to behave strangely.
Example 2
There are some different approaches we can take to solve this, such as waiting for 1 second after each direction button is pressed (see below). However, this means we have to drive in that direction for a whole second. We may not want to drive in that direction for a whole second but rather only for a fraction of a second, for example, to stop driving off a table. So this program works better than the previous one but still isn't great.
Lines 5, 8, 11, 14: Wait for 1 second before checking again for button presses.
Example 3
So what about adding an extra button to stop? This would work but makes the robot harder to
control.
Lines 11-12: If the B button is pressed then stop the robot. Otherwise it will just keep doing what it was told to do by the last button press. For example, if you pressed forward the robot will keep moving forward until another direction button is pressed or you press the B button. This didn't happen for the first program, as the robot would always be made to stop after each button press, as the command wasn't in an if statement.
For the recommended nested if-else program:

Lines 3-4: If the button with the up arrow is pressed then drive forward at full speed.
Lines 5-7: Else, if the button with the left arrow is pressed then turn left at full speed.
Lines 8-10: Else, if the button with the right arrow is pressed then turn right at full speed.
Lines 11-13: Else, if the button with the down arrow is pressed then drive backward at full speed.
Line 15: If none of the buttons are pressed then stop the robot (turn the motors off).
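Written out as text, the recommended nested if-else steering logic looks roughly like the sketch below. `ir_button_pressed()` and `set_motor()` are hypothetical stand-ins for the mBlock IR remote and motor blocks, and the button names are assumptions.

```python
# Sketch of remote control steering with nested if-else: one decision per loop.
# ir_button_pressed(name) and set_motor(port, speed) are hypothetical stand-ins
# for the mBlock IR remote and motor blocks.

def steer_step(ir_button_pressed, set_motor):
    if ir_button_pressed("up"):
        set_motor("M1", 255); set_motor("M2", 255)     # drive forward at full speed
    elif ir_button_pressed("left"):
        set_motor("M1", -255); set_motor("M2", 255)    # turn left at full speed
    elif ir_button_pressed("right"):
        set_motor("M1", 255); set_motor("M2", -255)    # turn right at full speed
    elif ir_button_pressed("down"):
        set_motor("M1", -255); set_motor("M2", -255)   # drive backward at full speed
    else:
        set_motor("M1", 0); set_motor("M2", 0)         # no button held: stop

# Run inside a forever loop, so the robot stops as soon as no button is pressed.
```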
For this program just make sure the light commands are in the correct if statement. It doesn’t matter
if the LED command is before or after the movement block just as long as it’s in the correct part of
the if statement. You need to add the red command after the motor has been turned to 0, the yellow
command for the left light in the left arrow if statement, the yellow command for the right light in
the right arrow if statement and the white command in the reverse if statement. A block the students
will probably forget is to turn the lights off when the forward button is pressed. The only time when
no lights should be on is when the robot is moving forward.
For the reverse warning, we need to add the tone block to the reverse if statement. For the horn, we
need to add an extra if statement. It doesn’t matter if we add this at the start or end of the loop.
However, it should be an extra if statement and not added to the existing one, as the condition for
sounding a horn doesn’t rely on or affect what the motors are currently doing.
Line 18: Play the reverse warning sound.
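The lights, reverse warning and horn can be added to the same structure, as in this rough sketch. The helpers `set_led(side, r, g, b)` and `play_tone(note, beat)`, and their arguments, are assumptions standing in for the mBlock LED and buzzer blocks; the colours and the separate horn if statement follow the description above.

```python
# Sketch of the steering program with lights, a reverse warning and a horn.
# set_led(side, r, g, b), play_tone(note, beat), ir_button_pressed() and
# set_motor() are hypothetical stand-ins for the mBlock blocks.

def steer_with_signals_step(ir_button_pressed, set_motor, set_led, play_tone):
    if ir_button_pressed("up"):
        set_motor("M1", 255); set_motor("M2", 255)
        set_led("all", 0, 0, 0)              # forward is the only state with no lights
    elif ir_button_pressed("left"):
        set_motor("M1", -255); set_motor("M2", 255)
        set_led("left", 255, 255, 0)         # yellow indicator on the left
    elif ir_button_pressed("right"):
        set_motor("M1", 255); set_motor("M2", -255)
        set_led("right", 255, 255, 0)        # yellow indicator on the right
    elif ir_button_pressed("down"):
        set_motor("M1", -255); set_motor("M2", -255)
        set_led("all", 255, 255, 255)        # white reversing lights
        play_tone("A4", 0.5)                 # reverse warning sound
    else:
        set_motor("M1", 0); set_motor("M2", 0)
        set_led("all", 255, 0, 0)            # red lights while stopped

    # The horn is a separate if statement: it does not depend on the motors.
    if ir_button_pressed("A"):
        play_tone("C4", 0.25)                # sound the horn
```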
Part 4. Using the ultrasonic sensor and buzzer to create a proximity sensor
To make the proximity sensor, we need to add an additional if-else statement. We have added it
at the end of the loop. The students can use the same or different notes for each distance. The
important part is to make the beat shorter the closer the robot gets to the obstacle so that the
proximity can be roughly guessed just from the sounds the robot is making. In the sample solution,
the beat has been halved each time from 30cm to 20cm to 10cm so we can tell how close we are to
the obstacle by how fast the robot is beeping.
Lines 24-31: This is the additional code for the proximity sensor.
Lines 24-25: If the distance from the obstacle is less than 10cm, play C5 for an eighth of a note.
Lines 26-28: Else if the distance is less than 20cm, then play B4 for a quarter of a note.
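The proximity sensor logic (a shorter beat the closer the obstacle) can be sketched as follows. `read_distance_cm()` and `play_tone()` are hypothetical stand-ins, the C5 and B4 notes and their beats come from the sample solution above, and the note used for the outer 30cm band is an assumption.

```python
# Sketch of the proximity sensor: beep faster as the obstacle gets closer.
# read_distance_cm() and play_tone(note, beat) are hypothetical stand-ins.

def proximity_beep_step(read_distance_cm, play_tone):
    distance = read_distance_cm()
    if distance < 10:
        play_tone("C5", 0.125)    # closest band: eighth beat, fastest beeping
    elif distance < 20:
        play_tone("B4", 0.25)     # middle band: quarter beat
    elif distance < 30:
        play_tone("A4", 0.5)      # outer band: half beat (note is an assumption)
    # beyond 30 cm: stay silent

# Added at the end of the forever loop, you can judge how close the obstacle
# is just from how quickly the robot beeps.
```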
Lesson Overview 6
Teacher Guidance 6
Student Sheet 5a: Safety and signalling
Answer Sheet 6b: Smart city challenges
Subject Update: Tips for teaching coding

60 minutes
Computing
• Understand the hardware and software components that make up computer systems, and how they communicate with one another and with other systems
• Understand simple Boolean logic and some of its uses in circuits and programming
• Use a programming language to solve a variety of computational problems

Overview
In the sixth lesson of this unit of work, your class will learn about some of the ways in which technology has failed in the past, and how engineers have worked to overcome those problems. They will then look back over the hardware and software they have been using in the past five lessons and combine these skills to complete challenges. This lesson should be used as an opportunity to consolidate learning, revisit any shaky territory and experiment with combinations of inputs, outputs and different ways of coding the mBot.

Lesson steps and learning outcomes
Step (10 mins)

Step (30 mins)

STUDENT SHEET 6a: Hardware and function cards

Motor
The mBot has two motors, each controlling one of the large rear wheels. By controlling the ...
Here's an example of some code to move forward for 3 seconds and then stop.
What other types of motors might you use in robotics?
Is run forward the only option for the movement block? If not, what other choices do you have?
If you need any more help with using the motor, have a look at the Lesson 2 activities again.

Line follower
The mBot has a line follower, which uses two LEDs and receivers to sense between ...
Here's an example of some code to tell the robot to move forward if both sensors detected black.
How might you use the line follower for a smart city scenario?
Can you remember the different values for the outputs, e.g. black & black = 0?
If you need any more help with using the line follower, have a look at the Lesson 3 activities again.
Ultrasonic sensor
The ultrasonic sensor measures ...
Here's an example of some code to tell the robot to stop if it detects an obstacle.
How might you use the ultrasonic sensor for a smart city scenario?
Can you code the mBot to do something else instead of stopping for an obstacle?
If you need any more help with using the ultrasonic sensor, have a look at the Lesson 4 activities again.

Remote control
The mBot can be controlled ...
Here's an example of some code to tell the mBot to move forward if the up arrow is pressed.
Can you think of other ways in which you would like to use the remote control?
If you need any more help with using the remote control, have a look at the Lesson 5 activities again.

LEDs
The mBot has LEDs, and each set has three separate colours. This means that the overall colour of the LEDs can be controlled by combining the strength of each colour.
Here's an example of some code to tell the mBot to show red lights when the mBot is not moving.
Can you think of other ways to use the LEDs?
What kind of smart city scenarios would need you to use lights?
If you need any more help with using the LEDs, have a look at the Lesson 5 activities again.

Buzzer
The mBot has a buzzer which can be coded to make different sounds for different events.
Here's an example of some code to tell the mBot to play a sound when the A button on the remote control is pressed.
Can you think of other ways to use the buzzer?
What kind of smart city scenarios would need you to use the buzzer?
If you need any more help with using the buzzer, have a look at the Lesson 5 activities again.
STUDENT SHEET 6b: Smart city challenges

Using black tape, create a course for your mBot to complete. Try not to put the lines too close together, and if you have a crossroads or T-junction think about how you might be able to code the mBot.
What sensor will you use?
What blocks will you need to use?

Use a course you have already created or make a new one using black tape. Place some obstacles on your course, such as pedestrians, road works or fallen trees. Make the mBot stop when it encounters an obstacle, and then start moving again once they are removed. You could also decorate your city by creating model ...
What sensor will you use?
What blocks will you need to use?

Escape from the dead end
This could be part of the course made for a city tour or a separate problem. Consider what the mBot should do after it ...
What sensor will you use?
What blocks will you need to use?

Make the city wondrous
This could be part of the course made for a city tour or a separate problem.
What sensor will you use?
What blocks will you need to use?

Drive around a fallen tree
What sensor will you use?
What blocks will you need to use?

Drive around a fallen tree and continue touring
What sensor will you use?
What blocks will you need to use?
ANSWER SHEET 6b Code solutions can be downloaded from:
encounteredu.com/teachers/units/code-smart-11-14
This program is just the same as the first one in Lesson 3. When students create the course just make
sure the black tape has clear white sections between it so the robot doesn’t get confused.
This is just the line following program with an extra condition added in for the ultrasonic sensor
telling it to stop when an obstacle is detected. As soon as the obstacle is removed the robot will start
moving again.
Lines 3-4: If the ultrasonic sensor detects an obstacle less than 10cm away then stop the robot.
Lines 5-12: Else, run the line following code.
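Combining the two behaviours amounts to one extra condition wrapped around the line-following logic, as this rough sketch shows; the helper functions are the same hypothetical stand-ins used in the earlier sketches.

```python
# Sketch: follow the line, but stop while an obstacle is closer than 10 cm.
# All helper functions are hypothetical stand-ins for the mBlock blocks.

def tour_city_step(read_distance_cm, sensor1_on_black, sensor2_on_black, set_motor):
    if read_distance_cm() < 10:                       # obstacle on the course
        set_motor("M1", 0); set_motor("M2", 0)        # wait until it is removed
    elif sensor1_on_black() and sensor2_on_black():
        set_motor("M1", 100); set_motor("M2", 100)    # on the line: drive forward
    elif sensor1_on_black():
        set_motor("M1", 40); set_motor("M2", 100)     # steer left back onto the line
    else:
        set_motor("M1", 100); set_motor("M2", 40)     # steer right back onto the line

# Because this runs in a forever loop, the robot starts moving again as soon
# as the obstacle is taken away.
```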
This program isn’t as complex as it sounds. It’s just the same as the program from lesson 4 for making
the robot react to avoid an obstacle. The only difference is the wait block after the turn. This is
because the ultrasonic sensor can only detect obstacles directly in front of it and this tells the robot
to continue to turn for another second. This ensures that the robot completes the turn to avoid the
obstacle, as otherwise it will start driving forward as soon as the obstacle clears the sensor. Make
sure the students don’t cheat by just making the robot turn before it drives into the dead end.
Line 5: Make the robot turn for longer so it manages to get out of the dead end.
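A rough text version of the dead-end escape: the same obstacle reaction as lesson 4, but with a longer turn so the robot ends up facing back out of the dead end. The helpers and the 2 second turn time are assumptions to tune.

```python
# Sketch: escape a dead end by turning for longer when a wall is detected.
# read_distance_cm(), set_motor() and wait() are hypothetical stand-ins.

def escape_dead_end_step(read_distance_cm, set_motor, wait):
    if read_distance_cm() < 10:                       # wall ahead
        set_motor("M1", 100); set_motor("M2", -100)   # spin on the spot
        wait(2)   # longer than a 90 degree turn, so the robot faces back out
    else:
        set_motor("M1", 100); set_motor("M2", 100)    # keep driving forward
```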
This challenge is a very open-ended opportunity to add lights and sounds. Students might build
programs like those from lesson 5 in which lights and sounds are produced when certain remote
buttons are pushed. Below is a simple example of adding lights and music when the A button is
pushed. Students can add LED and buzzer blocks to any of the other programs they’ve written.
Lines 4-7: Make the robot LEDs turn red and then blue.
Lines 8-9: Make the robot produce music.
For this example program answer, the robot took 3 seconds at engine speed 100 to drive the length of the obstacle. The students will have to determine the time it takes for the robot to drive the length and width of the obstacle they use. The program assumes the robot drives towards the middle of the obstacle, so to navigate around it, we turn left and drive forward for 2 seconds to be sure we clear the length of the obstacle. Then we turn right and drive forward for 3 seconds to ensure we clear the width of the end (you can measure this as well). Finally, we turn right and drive forward for 2 seconds and turn left again. We should be roughly on the path we would be on if the obstacle weren't in the way. Remember the amount of time you turn for depends on the floor surface and engine speed, and you will have to experiment with this. This program makes the robot go around the left side of the obstacle, but it would also work to drive around the right side of the obstacle.
Lines 13-16: Turn right and drive to roughly the other side of the tree from where the robot detected the tree.
Lines 17-18: Turn left to be in position to keep driving along the original route.
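The drive-around-a-known-obstacle answer is a fixed sequence of timed moves, sketched below in text. The 2 and 3 second timings come from the example answer above and the turn time is a placeholder; all helper functions are hypothetical stand-ins for the mBlock blocks.

```python
# Sketch: drive around a fallen tree of known size using timed moves.
# The 2 and 3 second timings follow the example answer (engine speed 100);
# measure your own obstacle and surface. Helpers stand in for mBlock blocks.

def go_around_tree(set_motor, wait, turn_time=0.5):
    def drive(left, right, seconds):
        set_motor("M1", left); set_motor("M2", right); wait(seconds)

    drive(-100, 100, turn_time)   # turn left away from the tree
    drive(100, 100, 2)            # clear the length of the obstacle (2 s)
    drive(100, -100, turn_time)   # turn right to drive alongside the tree
    drive(100, 100, 3)            # clear the width of the end (3 s)
    drive(100, -100, turn_time)   # turn right again, back towards the route
    drive(100, 100, 2)            # return to roughly the original path (2 s)
    drive(-100, 100, turn_time)   # final left turn to face the original direction
    set_motor("M1", 100); set_motor("M2", 100)   # continue along the route
```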
This program combines the program to drive around a known obstacle, the fallen tree example
above, with the line following program. There are only a couple of differences: 1) we remove the first
block to turn on the engine as the line following section of the program will do this and 2) the final
drive forward doesn’t need a wait block as it will keep running forward until it finds the black line
again.
Lines 3-16: If the ultrasonic sensor detects a fallen tree less than 10cm away, then drive around the tree.
Line 15: Keep driving until the robot finds the path again.
Lines 17-18: If-else block. If both sensors can see the black line then drive forward.
Lines 19-23: Else, another if-else block which adjusts the robot's direction so both sensors are on the line again. If only sensor 1 can see the black line then we have to turn slightly to the left until both sensors can see the black line again. Otherwise, we are too far to the left and have to turn right to fix the robot's direction.
In part one of the workshop, your class will use personas to empathise with
different types of people. They will then use these insights to brainstorm
ways that robots and autonomous vehicles can improve lives or solve
problems.
Parts two and three of the workshop see student groups select, refine and
prototype ideas before presenting and demonstrating their proposals.
Lesson Overview 7
Teacher Guidance 7

Designing our smart city
Age 11-14 (Key Stage 3)

Design & Technology
• Use research and exploration to identify and understand user needs
• Develop specifications to inform the design of innovative, functional, appealing products that respond to needs in a variety of situations
• Use a variety of approaches to generate creative ideas and avoid stereotypical responses

Overview
In the last section of this unit of work, your class will take part in a Design Thinking Workshop that can be delivered as three one-hour sessions or combined as a half-day activity.

In part one of the workshop, your class will use personas to empathise with different types of people. They will then use these insights to brainstorm ways that robots and autonomous vehicles can improve lives or solve problems.

Parts two and three of the workshop see student groups select, refine and prototype ideas before presenting and demonstrating their proposals.

Resources
Slideshow 7: Designing our smart city pt. 1
Student Sheet 7a: User profiles
Student Sheet 7b: Empathy map
Video: Design thinking
Subject Updates: How can autonomous vehicles be useful?; Futures Thinking

Kit (per group)
No additional kit required

Lesson steps and learning outcomes
1. Video opener (5 mins)
Introduction to design thinking. This workshop's challenge is to use design to solve the problems of citizens living in a city of the future.
• Understand that design is a process
• Name at least one job associated with design
2. Classroom discussion (10 mins)
Students will discuss the video and share ways they've solved problems in the past.
• Describe basic design thinking techniques
3. Design thinking: empathise (15 mins)
Students will use character personas to empathise with different people and their travel related problems.
• Understand that issues affect people in different ways
• Empathise with different people and describe how they might see the world
4. Design thinking: ideate (20 mins)
Students will complete a class brainstorming activity then work in groups to brainstorm ideas to make life better for the character personas.
• Think creatively to generate solutions to problems
Step

Step
Give each group one or more user profiles, so they can empathise with how different types of people would use or be affected by a design.
User profiles

Name: Rowan Berry
Age: 36
Occupation: Part-time cashier. Rowan is also the primary carer for two young children.
Where does he live? On the outskirts of town where rent is cheap.
How does he currently travel? Mostly by car. It's expensive to keep a car but the bus is infrequent and inconvenient.
How often does he travel? Four journeys a day: to and from school to drop off the kids, then to and from work.
What is most important to him about his mode of transport? Convenience, speed and cost.

Name: Rosemary Sultana
Age: 72
Occupation: Retired chemist
Where does she live? In the centre of town.
How does she currently travel? Mostly by bus. It's free with her Senior Citizen Card and she likes looking out the window. She worries about getting hit by cars crossing the road as her eyesight isn't perfect any more.
How often does she travel? Rosemary takes around five journeys a week.
What is most important to her about her mode of transport? Cost and enjoyment.
User profiles

Name: Ash Quince
Age: 19
Occupation: Full-time student
Where does he live? On campus at university, near the town centre.
How does he currently travel? Ash is blind. He mainly travels on foot with his guide dog. His parents sometimes drive him but he'd prefer to be self-sufficient.
How often does he travel? Several short journeys a day, plus a long journey once a month.
What is most important to him about his mode of transport? Cost, safety and independence.

Name: Holly Apple
Age: 49
Occupation: Marketing manager
Where does she live? In the suburbs, but she works in the town centre.
How does she currently travel? By car at least twice a day on work days. She feels guilty about the environmental cost and hates traffic but she can't find a good alternative.
How often does she travel? At least twice a day on work days, plus she likes to drive in the country at the weekend.
What is most important to her about her mode of transport? Convenience and comfort.
User profiles

Name: Laurel Plum
Age: 27
Occupation: Delivery driver
Where does she live? In a flatshare near the town centre.
How does she currently travel? She travels by a company-owned van. It often breaks down, which annoys her.
How often does she travel? All day every day for work. She likes to go for cycle rides on her days off.
What is most important to her about her mode of transport? Reliability, comfort and speed.

Name:
Age:
Occupation:
Where does this person live?
How do they currently travel?
How often do they travel?
What is most important to them about their mode of transport?
Empathy map
Lesson Overview 8
Teacher Guidance 8
Video: Prototyping

Designing our smart city
Age 11-14 (Key Stage 3)

Computing
• Use a programming language to solve a variety of computational problems

Design & Technology
• Test, evaluate and refine ideas and products against a specification, taking into account the views of intended users and other interested groups
• Develop and communicate design ideas using annotated sketches, oral and digital presentations and computer-based tools
• Apply computing and use electronics to embed intelligence in products that respond to inputs, and control outputs, using programmable components

Overview
Part two of the workshop sees your class use an ideas funnel to select and refine ideas from the brainstorming activity in part one. Each group will then prototype one of the ideas using the hardware and software skills they have learned with the mBot in lessons 1-6.

Resources
Slideshow 8: Designing our smart city pt. 2
Student Sheet 8a: Ideas funnel
Video: Prototyping
Subject Updates: Futures thinking

Lesson steps and learning outcomes
1. Video opener (5 mins)
Introduction to prototyping, including examples of different ways people prototype and why it is useful.
• Understand what prototyping is and why it is used
2. Classroom discussion (10 mins)
The class will discuss the video and think about how they can use prototyping to take some of their ideas forward.
• Describe a number of prototyping methods
3. Design thinking: prototype (35 mins)
Groups will use an ideas funnel to choose one idea from the brainstorming session and use the mBot and crafting skills to create a working demonstration that they will be able to present to the class in the next part of the workshop.
• Evaluate and refine own ideas and the ideas of others
• Combine hardware, software and crafting skills to make a prototype
4. Reflect (10 mins)
Students will reflect on the prototyping activity and share their learning.
• Share learning with the class
Step

Step

Ideas funnel
[Funnel diagram with spaces for students' ideas and a "Why?" prompt]
Lesson Overview 9
Teacher Guidance 9

Designing our smart city
Age 11-14 (Key Stage 3)

Design & Technology
• Use research and exploration to identify and understand user needs
• Test, evaluate and refine ideas and products against a specification, taking into account the views of intended users and other interested groups
• Develop and communicate design ideas using annotated sketches, oral and digital presentations and computer-based tools
• Apply computing and use electronics to embed intelligence in products that respond to inputs, and control outputs, using programmable components

Overview
In part three of the workshop each group will discuss different ways of sharing ideas, then create articles, posters, videos, photo galleries or reports to persuade their audience that their prototypes are worth taking forward. Then each group will present their prototypes and demonstrate their ideas in action using the mBot as part of a working display.

Lesson steps and learning outcomes
1. Video opener (5 mins)
Communicating ideas, introducing the ideas of different audiences and different methods of communication.
• Understand why sharing ideas is important
• Name at least one job associated with communication
2. Classroom discussion (10 mins)
The class will discuss the video and link different methods of communication with the audiences they might reach.
• Identify a variety of different media and describe when each might be used
Step

Step
Hand out Student Sheet 9a: Communicating your ideas. Students compare the audience (who it reaches), reach (how many people it reaches) and tone of different methods of communication, then think about the best ways of communicating with their target audience.

STUDENT SHEET 9a
Communicating
your ideas
Compare different ways to communicate.
Blog post
Billboard
Advert in
The Guardian
Column on the
Mail Online
Snapchat
Leaflets
Direct email
TV advert
Word of mouth
Posters
Subject Update: What is a smart city?

Did you know? Today, over half of the world's population lives in urban areas, and this number will increase to about two thirds of the world's population by 2050.

The Code Smart education programme challenges students to design solutions for the smart city. The smart city is a concept rooted in innovation, information and infrastructure. It points to issues of sustainability and how an increasingly urban population can enjoy a decent future.

The smart city
Today, over half of the world's population lives in urban areas, and this number will increase to about two thirds of the world's population by 2050. That means that today, 3.9 billion people are living in cities, and this number will continue to rise exponentially.

A successful and sustainable future will need successful and sustainable cities. This is the heart of UN Sustainable Development Goal 11: to make cities and human settlements inclusive, safe, resilient and sustainable. Smart cities are a way of making life better for the billions of city dwellers around the world. (Learn more about SDG 11.)

The ideal city would be cleaner, quieter, safer, more accessible and healthier.
Cleaner cities would have less trash and less pollution than current systems.
Quieter cities would have less noise from cars and a more organised sense of the
urban chaos that city dwellers love. Safer cities would be well lit, well patrolled,
and have a strong sense of community. Accessible cities would make public
transportation the cheapest and easiest option for travel for all people, negating
the need for cars and the congestion and pollution they bring. Bike access and
priority is also a cornerstone of the ideal sustainable city. Finally, the ideal city
makes health of individuals and the city a priority. This means people have
easy access to fresh food, recreation, and healthcare and there are top-notch
sanitation systems, like great sewer and waste services.
Also in the transportation arena, smart traffic management is used to monitor and analyse traffic flows to prevent roads from becoming too congested. Such smart traffic management systems would be easier to integrate with autonomous vehicles. Smart public transit is able to coordinate services and fulfil traveller needs in real time, improving efficiency and satisfaction. Ride-sharing and bike-sharing are also common services in a smart city.

Energy
Energy conservation and efficiency are major focuses of smart cities. Using smart sensors, smart street lights dim when there aren't cars or pedestrians. Smart grid technology can be used to improve operations, maintenance and planning more generally to supply power on demand and monitor energy outages. Smart city energy initiatives also aim to monitor and address environmental concerns such as air pollution and climate change that result from energy consumption.
Waste
Sanitation can also be improved with smart technology. One example is using
internet-connected bins and a connected fleet of rubbish lorries to prioritise
areas of need rather than driving on a pre-defined schedule. Sensor technology
could also be used to test the quality of drinking water or spot potential sewage
blockages before they become an issue.
Safety
Smart city technology is increasingly being used to improve public safety, from
monitoring areas of high crime to improving emergency preparedness with
sensors. For example, smart sensors can be critical components of an early
warning system before droughts, floods, landslides or hurricanes.
Buildings
Smart buildings are also often part of a smart city project. Old buildings can
be retrofitted, and new buildings constructed with sensors to provide real-time
building management and ensure public safety. Attaching sensors to buildings
and other structures can detect wear and tear and notify officials when repairs
are needed. Citizens can also help in this matter, notifying officials through a
smart city app when repairs are needed in buildings and public infrastructure,
such as potholes in roads. Sensors can also be used to detect leaks in water mains and other pipe systems, helping to reduce costs and improve the efficiency of public workers.
Did you know? Researchers estimate that there are at least 700 million parking spaces in the US, covering more than 6,000 square miles.

Though self-driving cars used to be science fiction fantasies, they are now likely to be part of our not-too-distant future and have the potential to re-shape our society in a way similar to the impact of agriculture, the industrial revolution, or the internet. Autonomous vehicles could change how we approach ownership, reshape our cities, improve welfare and accessibility and increase safety.

Shift from individual car ownership to shared ownership
Despite existing car sharing solutions, such as taxis, carpooling and car rentals, many adults currently own a car. In 2015, there were an estimated 263 million cars in the United States. That is almost one car for every adult. Additionally, in 2017 Americans spent nearly $2 trillion on car-related costs (including fuel, maintenance and insurance), more than they spent on food that year.

Re-imagine parking

Unfortunately, the average vehicle is only used 4% of the time and remains parked for the other 96%. Autonomous cars enable vehicles to be used more efficiently, by allowing them to be in continuous use over the course of a day to transport multiple groups of people. For example, in the morning an autonomous vehicle could take students to school and adults to work. Afterwards, that vehicle could drive tourists to landmarks, take the elderly to the doctor, and help people run errands. Then in the evening the autonomous car can help students and adults return home. Because these vehicles will be in constant use, there will be less need for parking spaces. In the US there are an estimated 700 million parking spaces, taking up more than 6,000 square miles, that could be transformed into parks, shops and housing.
More efficient use of vehicles will also decrease the total number of cars needed.
Fewer vehicles also mean less traffic, especially since autonomous vehicles can
communicate with each other. This enables them to drive faster and closer
together, which also allows people to get to their destinations faster.
Since these vehicles are shared, it is likely that most autonomous vehicles will not be owned by individuals, but instead owned by businesses that rent them out on a pay-as-you-go basis. Experts estimate that this will still be cheaper for most individuals, since they will no longer have to buy, fuel or repair their own cars.
Autonomous vehicles will also increase accessibility of travel for people who
otherwise could not drive, such as people with epilepsy, people with poor vision,
or others with impairments.
Did you know? According to the World Health Organization, there were approximately 1.25 million road traffic deaths in 2013.

Currently, a leading cause of preventable deaths is road accidents. According to the World Health Organization, approximately 1.25 million pedestrians, cyclists, motorcyclists and car occupants were killed in road traffic accidents in 2013. Driverless vehicles can minimise deaths and injuries due to human errors such as distracted driving and aggressive driving, saving millions of lives and billions of dollars in damages and healthcare costs annually.

So is it all good?
It is likely that many people will lose their jobs as autonomous vehicles become more mainstream. This includes drivers as well as people involved in the production of cars. In the US, this is an estimated 4 million driver jobs and
another 3 million jobs in the production and sales of cars.
In the past, whenever technology has displaced jobs, new jobs and industries
have also been created, such as designing and maintaining the autonomous
vehicles, both hardware and software, but the transition for workers who lose
their jobs can be difficult.
What's around me?
Before deciding what to do, an autonomous car will have to determine what’s
around it. Just like you wouldn’t want to walk straight into a wall or trip over a
rock, autonomous vehicles will need a way to identify and track obstacles in their
environment. This will mean a system of sensors and algorithms must be in place
to recognise pedestrians, other vehicles, buildings, and more. This is especially
important for keeping everyone safe, inside and outside of the vehicles.
What do I do next?
With all that information coming in, it can be challenging to decide what to do
next. Humans make decisions on how to reach their goals based on different
priorities. Autonomous vehicles will need to determine what to do next in order to transport you to your destination, based on various algorithms. The intelligent software systems within autonomous vehicles will use the information about where they are and what is around them to calculate a safe and efficient route to transport people to their destinations.

Autonomous vehicles need to be aware of their surroundings, including pedestrians.
Did you know? Many cars already have some driver assistance, such as cruise control or parking assistance.

Hands-off driving

The terms autonomous car, driverless car, self-driving car and robot car have entered our language and are often used interchangeably, but when can a car actually be considered autonomous? Broadly, an autonomous car is capable of sensing its environment and navigating without human input.

As the scope of and interest in autonomous cars has grown, a group of engineers and technical experts has produced a guide to the levels of automation in vehicles of the future: SAE International's J3016 standard, also known as the Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems. The guide describes six levels of autonomy, from no autonomy at Level 0 to full autonomy at Level 5.

Level 0: No automation
Exactly as described. The car is not automated in any way and relies on a human for all of the tasks. Even if warning systems flash, a human is still in full control.
Levels 1 and 2: Driver assistance and partial automation
The human driver is still expected to do the majority of the work, but is assisted by automated systems. For example, cruise control sets the speed of the vehicle, while the human driver steers. Parking assistance can automate steering, with the human driver controlling the speed. The driver must be ready to retake full control at any time.
Level 3: Conditional automation
Also known as 'eyes-off', at Level 3 the driver does not need to monitor the driving environment, but can turn their attention to other activities like sending an email or watching a film. The vehicle will handle safety-critical functions like emergency braking, but can still call on the driver to take control.

Vehicles at Level 3 and above are considered 'automated driving systems'. The substantial difference here is that the vehicles are able to monitor the driving environment around them. Crucially, these types of vehicles make decisions themselves. For instance, a Level 3 car will be capable of seeing a slower-moving vehicle in front of it before making the decision to overtake. The human is on hand, mostly, to intervene if things go wrong.

Level 4: High automation
At Level 4, the vehicle is fully autonomous within certain driving scenarios. This means that the driver could go to sleep or leave the driver's seat and the vehicle is able to handle all aspects of the driving task. However, this functionality is limited to what is known as the 'operational design domain' of the vehicle. For example, a vehicle might only be fully autonomous in traffic jams or within a limited geographic area.

Level 5: Full automation
Full automation means full-time performance by an automated driving system for all aspects of driving and under all conditions.
Did you know? You can always return the mBot to the default program, even after you have uploaded other programs.

The mBot comes with a default program uploaded so you can start playing with your mBot as soon as it is built. This default program has three modes, which allow you to explore different functionalities.

The three default modes

Mode A – Remote control driving You can use the arrow buttons on the remote to drive the robot forward and backward, as well as turn left or right. The number buttons correspond to different sounds the mBot can make using its buzzer. In mode A, the mBot's LEDs are white.
Mode B – Obstacle avoidance The mBot drives forward on its own until it detects
an obstacle, such as a wall. If it detects an obstacle, it turns then continues
driving forward. In mode B, the mBot’s LEDs are green.
Mode C – Line following The mBot will drive forward while following a black line.
In mode C, the mBot’s LEDs are blue.
The mBot starts off in mode A, and there are two ways to switch between the
modes:
1. Select the mode by pressing the A, B, or C button on the remote.
2. Press the onboard button on the mBot (located at the front end of the
mCore).
You can always return the mBot to the default program, even after you have
uploaded other programs. This may be helpful if you want to use the mBot with a
new group of students or if you want to test that the mBot is working.
1. Connect the mBot to the computer using the USB cable and turn on
power to the mBot.
2. Open mBlock and connect to the mBot. Click the Connect menu ->
Serial Port and select the last port.
3. Next, click the Connect menu -> Reset Default Program. The default program will upload to the mBot.
4. When the upload is finished, you can disconnect the mBot and use the modes of the default program.

Students can drive the mBot as soon as it is built.
Did you know? The mBlock language is easy to learn and allows you to code your mBot and other Arduino-based hardware quickly.

In order to program your mBot, you will need mBlock on your computer. mBlock is a graphical programming environment developed by Makeblock based on Scratch 2.0 Open Source Code. The mBlock language is easy to learn and allows you to code your mBot and other Arduino-based hardware quickly.

This Subject Update provides you with the instructions to get mBlock set up on your computer before your first coding session in Code Smart Lesson 2.

Download and install mBlock 3

Note: These instructions will help you to get started with mBlock on your computer or laptop. There is a different set of instructions for tablets and mobile devices. You may need to contact your school's network manager if your school system requires permission to install outside software. They may also be able to help with getting the software on all of the computers your students will be using.

mBlock can be downloaded for Windows, Mac OS, Chrome OS and Linux.

1. Download and install mBlock 3 for your operating system from the Makeblock website.
2. You may also need to download the Arduino Software (IDE) for your operating system from https://www.arduino.cc/en/Guide/HomePage, especially if you are using macOS.
3. When you open mBlock, you should see a screen similar to this:

4. There are a couple of settings that you should check in order to be ready to code your mBot. Go to the Boards menu and check that mBot (mCore) is selected. It may have a check mark next to it already. If not, click mBot (mCore) to select it.

5. On the left side of your screen you should see all of the different 'Scripts' that you can use. You'll mostly be using the blue Robots blocks, gold Control blocks, and green Operators blocks.

mBlock is based on Scratch 2.0, and mBlock scripts are made of bit-sized blocks. For the mBlock FAQ, go to http://mblock.cc/faq/
Uploading code

6. When you are ready to write your code, all you have to do is click a block, then drag and drop it to the script area in the middle of your screen. Note that the Arduino code in the right panel automatically updates, which can be helpful for demonstrating what typical typed code might look like.

7. As you add more blocks to your code, make sure they snap into place. You can also rearrange blocks as much as you like. To delete a block, just drag it back into the script blocks section. Note: many blocks also have drop-down menus within them to allow you to adjust specific commands. Please refer to the mBlock glossary Subject Update for brief descriptions of the blocks used in Code Smart.

8. Connect the mBot to your computer with the USB cable provided.

9. Then switch ON the power to the mBot and set it upside down. We recommend putting the mBot upside down because it immediately starts running programs once they are uploaded and we wouldn't want the mBot to run off the desk or table!
Quick start guide to uploading code to the mBot (continued)

10. Go to the Connect menu and enter the Serial Port submenu. Select the last option. The last option should be something like 'COM#', where # is the largest number in the list.

11. You should notice that your mBlock environment updates to say 'Serial Port Connected'. Additionally, you should notice a green status dot in the Scripts column next to mBot if you have the Robots block type selected.

Don't close the upload window, as it will give you a status update when uploading is finished. (Note: if the green dot next to mBot in the Scripts column turns red, that is okay.)

14. When the upload is finished you should see the small window update to look like this:
15. Success! Your mBot should have your program uploaded. You may even
notice your mBot moving as soon as the upload is done. To let the mBot
move according to your program, disconnect the mBot, set it on the
ground, and toggle the power switch OFF and then ON again. Your mBot
should do what your program says.
mBlock app

Did you know? MakeBlock has released an app for iOS and Android so that you can use tablets and mobile devices to control the mBot.

In addition to the desktop mBlock software for personal computers and laptops, MakeBlock has released an app for iOS and Android so that you can use tablets and mobile devices to control the mBot. The mBlock app is a game-based programming app which challenges students to write progressively more difficult code. There is also the option to create code in a block-based programming environment right in the app and then to upload the program to the mBot via Bluetooth.

Downloading the mBlock Blockly app

You can download the mBlock Blockly app for iOS and Android from the Makeblock website: https://www.makeblock.com/software/mblock-app/downloads or from the App Store or Google Play store.
mBlock can be downloaded for iOS and Android.

Connecting the mBot
The mBlock app communicates with the mBot using Bluetooth. To connect your device to your mBot:
1. Make sure Bluetooth is enabled on your device and that the mBot is powered on.
2. The app may prompt you to connect, or you may notice an 'unlinked' icon in the top right corner of the app (if so, click on the icon).
3. The app will prompt you to bring the device close to the mBot. The device and the mBot should link automatically.

If the device and the mBot do not link automatically, check that the mBot is turned on and that Bluetooth has been enabled on the device. If this does not work, then try upgrading the firmware.

You can use the mBlock Blockly app even without an internet connection.

Upgrading Firmware
As additional features are added to Makeblock software and hardware, you may need to upgrade the firmware.
1. Connect the mBot to the computer using the USB cable and turn on power to the mBot.
2. Open mBlock and connect to the mBot. Click the Connect menu -> Serial Port and select the last port.
3. Next, click the Connect menu -> Upgrade Firmware. The firmware upgrade will start automatically.
4. When the upgrade is finished, you can disconnect the mBot.
There are two primary features of the app that may be of interest:
1) game-based coding challenges
2) programming environment
1) Game-based coding challenges
There are step-by-step instructions that will guide you through progressively more difficult challenges and will show you more features of what the mBot can do.

You can code on your mobile device in the mBlock Blockly programming environment.
2) Programming environment
To access the in-app programming environment, click the ‘create’ icon.
The programming environment in the app should look familiar if you’ve used
mBlock or Scratch. However, be aware that it is a little different. Because it is
block-based coding, you’ll likely be able to jump in by reading carefully and
learning through trial and error. The game-based coding challenges are a good
way to learn more about this in-app programming environment.
Did you know? The most important skill you will need as a teacher is the ability to help your class solve their own problems.

You don't have to be an expert programmer to teach robotics and coding. The most important skill you will need as a teacher is the ability to help your class solve their own problems.

However, it is helpful to have some tips for good practice that will guide you and your students. In this Subject Update, you'll find tips about writing, editing and sharing code. You may also find the Troubleshooting guide Subject Update useful. It has solutions to the most common errors you'll come across in your robotics education journey. If you don't find the solution to your problem in these Subject Updates, there is also a helpful forum of mBot users online: http://forum.makeblock.com/
In programming there are often multiple ways to solve a problem. For example, if
you want to make your mBot keep driving forward if there is not an obstacle right
in front of it, but stop when there is one, you might write a piece of code like this:
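The original page shows this as two mBlock block programs, which are not reproduced here. As a rough typed illustration of the same logic only, here are two Arduino-style versions, assuming the Makeblock MeMCore library, an ultrasonic sensor on port 3 and an example stopping distance of 10 cm (the port, speeds and distance are assumptions you would adapt to your own mBot):

    #include <MeMCore.h>

    MeDCMotor motorLeft(M1);                // on the mBot the left motor is mounted mirrored,
    MeDCMotor motorRight(M2);               // so it is driven with the opposite sign to the right motor
    MeUltrasonicSensor ultrasonic(PORT_3);  // assumed sensor port

    void setup() {
    }

    // Version 1: choose between driving and stopping with an if ... else
    void loop() {
      if (ultrasonic.distanceCm() > 10) {   // the path ahead is clear
        motorLeft.run(-100);                // drive forward
        motorRight.run(100);
      } else {                              // an obstacle is close
        motorLeft.run(0);                   // stop both motors
        motorRight.run(0);
      }
    }

    // Version 2: drive by default, and stop only when an obstacle is detected
    // (rename this to loop() to try it in place of Version 1)
    void loopVersion2() {
      motorLeft.run(-100);
      motorRight.run(100);
      if (ultrasonic.distanceCm() <= 10) {
        motorLeft.run(0);
        motorRight.run(0);
      }
    }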
The two programs both solve the same problem, but approach it slightly differently. In this case, both solutions are fairly similar in logic and length. However, sometimes there is a 'better' solution. For example, when making the car drive in a square, you might write out the drive and turn steps one after another, or you might put the repeated steps inside a repeat loop. Both solutions work, but the second solution would be considered 'more elegant'.
Some coders like to talk about writing ‘elegant code’. But what does that mean?
When you think of elegance in the literary world, you may think of long, beautiful
sentences and descriptive paragraphs. In coding, it’s different. It’s more
important to be clear and concise, like a straightforward recipe. Elegant code
should be clean, concise and easy to understand.
Find out more in the Coding tips: writing elegant code Subject Update.
It is rare to write a working program the first time you try. Do not worry about failing. To fail is to make your First Attempt In Learning. Mistakes in your code show that you are trying.

Debugging is a systematic way for your students to iron out their mistakes. Instead of giving them the answers, you should encourage your students to work through potential problems step-by-step, starting from basic issues and working up to more complex issues.
Find out more in the Coding tips: failing and debugging Subject Update.
Comments
Programs can include comments which are ignored by the computer but are
meant for humans to read to help them understand the program. When you write
code, it is helpful to add comments to let other people know what it does, or to remind you what it does when you've not looked at it in a while.
Sharing code
Code should be shared so we can learn from others and so we can work together
to write better code. Code is written to solve a problem, and this is very rarely
only a problem for one person. By sharing your code, you are helping others solve
their problems, and, by letting them see the actual code, they can understand
how it works and apply it to solve other similar problems.
By working together in this way, we can solve bigger problems, rather than having everyone individually solve the same smaller problems repeatedly and making no progress. Computer software where the code is available for anyone to use and see is called open source. The largest open source community in the world can be found on github.com.
Coding tips:
writing elegant code
Did you know? When you write code for your computer, it's like writing a recipe and you need to be very clear.

Some coders like to talk about writing 'elegant code'. But what does that mean? When you think of elegance in the literary world, you may think of long, beautiful sentences and descriptive paragraphs. In coding, it's different. It's more important to be clear and concise, like a straightforward recipe. Elegant code should be clean, concise and easy to understand.

What is elegant code?
Remember that code is a set of instructions for your computer. When you write
code for your computer, it’s like writing a recipe and you need to be very clear.
If you had to follow the instructions on a messy, long and confusing recipe for
making a cake, you might miss an important step, need extra time, or even give
up on completing the cake!
Clear instructions are important.

Elegant code should be clean, concise and easy to understand. It's not about showing how clever you are or how complicated you can make your code, it's about finding the simplest way possible of making your idea work. As the famous writer Antoine de Saint-Exupery said, "perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away."
Look at the two examples of code to make a car drive in a square (sketched in typed form below). Both programs make the car do exactly the same thing, but which solution would be easier to type out and faster to explain?
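The originals are two mBlock block programs. As an illustration only, here is roughly how the two approaches could look as typed Arduino-style code, assuming the Makeblock MeMCore library and example speeds and timings that would need tuning on a real mBot:

    #include <MeMCore.h>

    MeDCMotor motorLeft(M1);   // the left motor is mounted mirrored, so it runs with the opposite sign
    MeDCMotor motorRight(M2);

    // Drive forward for about one second, then stop (example values).
    void driveForward() {
      motorLeft.run(-100);
      motorRight.run(100);
      delay(1000);
      motorLeft.run(0);
      motorRight.run(0);
    }

    // Spin clockwise for about half a second, then stop (example values).
    void turnRight() {
      motorLeft.run(-100);
      motorRight.run(-100);
      delay(500);
      motorLeft.run(0);
      motorRight.run(0);
    }

    void setup() {
      // Example 1: every side of the square written out by hand
      driveForward(); turnRight();
      driveForward(); turnRight();
      driveForward(); turnRight();
      driveForward(); turnRight();

      // Example 2: the same square using a repeat loop
      // (in a real program you would keep only one of the two examples)
      for (int side = 0; side < 4; side++) {
        driveForward();
        turnRight();
      }
    }

    void loop() {
    }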
The second piece of code is easier to type and would be quicker to explain. Also,
when you need to drive in a pentagon or a hexagon in the future, it will be much
easier to modify the code in the second example.
There are many cases where multiple solutions might achieve the same goal, but
one program may be a lot more efficient. For example, if a driverless car is using
a camera to read the traffic lights one program might constantly scan the whole
screen for traffic lights. However, another program might only scan at street
junctions and only scan the sides of the screen because that is where we know
traffic lights will be. Both programs would have the same result, but the second
one is more efficient and can save time, memory, and energy. You might not
think efficiency is a big deal, but when considering programs for driverless cars,
every second can count. Having an efficient program could be the difference
between stopping in time and crashing.
Coding tips:
failing and debugging
Remember! If we make no mistakes, we may not learn about all the different ways to tackle a problem and may actually miss a better solution.

Do not be scared to test code at any time. We cannot be sure our code will work until we test it. Often it is easy to miss something that might happen when it runs from just reading the code. If something unexpected happens, it's an opportunity to think about what went wrong and how to fix it.

Failing
It is rare to write a working program the first time you try. Do not worry about failing. To fail is to make your First Attempt In Learning. Mistakes in your code show that you are trying.

When we fail, we have an opportunity to learn from fixing the problem and this helps us to understand how things went wrong in the first place. If we make no mistakes, we may not learn about all the different ways to tackle a problem and may actually miss a better solution.

Robot fail

If you or your students are feeling a little deflated about code not working,
remember to have a look at the Robot fail video.
Debugging
Debugging is a systematic way for your students to iron out their mistakes. It is
highly likely that your students will get something wrong and their code may not
work the first time. At this stage, it might be tempting to leap in and try to solve
the problem. Instead of giving them the answers, you should encourage your
students to work through potential problems step-by-step, starting from basic
issues and working up to more complex issues.
Coding tips:
commenting
Documentation is important
Comments are very important for sharing code, so that anyone reading, editing
or debugging the original code in the future is able to understand the ideas and
intent of the original author.
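In typed Arduino-style code, comments are marked with // (or enclosed in /* ... */ for longer notes); the computer skips them, but readers can see them. Here is a small illustrative sketch, assuming the Makeblock MeMCore library, showing how comments sit alongside the code they explain:

    #include <MeMCore.h>

    // Comments start with '//' (or sit between /* and */ for longer notes).
    // The computer ignores them; they explain the code to the people reading it.

    MeDCMotor motorLeft(M1);   // left wheel motor (mounted mirrored on the mBot)
    MeDCMotor motorRight(M2);  // right wheel motor

    void setup() {
      // Drive forward at a moderate speed for two seconds, then stop.
      motorLeft.run(-100);
      motorRight.run(100);
      delay(2000);             // 2000 milliseconds = 2 seconds
      motorLeft.run(0);        // speed 0 stops the motor
      motorRight.run(0);
    }

    void loop() {
      // Nothing to repeat in this example.
    }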
Troubleshooting guide
mBot has many pieces.

The robot drives the opposite of what is expected, e.g. backward when you press the forward / up button.
The motors might be wired the wrong way around. To fix this, disconnect and reconnect the motor wiring in the correct arrangement.
The robot does not respond correctly to the remote control (without additional
coding).
Check what mode the mBot is in. It should start off in mode A manual remote
control driving. To get back into mode A, press the A button and check that the
mBot’s LEDs are white. If the robot does not respond at all, check the batteries
and power. Finally you can try re-uploading the default program to the mBot
(see the Subject Update mBot’s default program).
The robot does not stop in time when it detects an obstacle.
Try making the obstacle detection distance larger. This will ensure you give the robot time to stop. The distance you need to set will be affected by the speed. If the robot is driving fast, you may need to make the distance larger.
If the mBot needs to turn more (a sharper turn), try increasing the amount of time it is turning for first, and then consider increasing the speed. Especially on rougher surfaces, like carpet, you may need the mBot to turn for a longer time or more quickly than on a smoother surface, like tile.

Professionally-designed robots fail too.
Robot keeps losing the black line when working with the line follower.
Black lines might be too close together or too narrow. Bad lighting can also
confuse the sensor.
The robot doesn’t respond as expected to the remote control when the students
program it.
Refer to Answer Sheet 5 from Code Smart to help you.
Tip: make sure to tell the robot to check for obstacles first, before telling it the line follower code.

The robot lights stay on after I've finished with them.
If you do not turn them off, the mBot lights will stay on for as long as the robot has power. You should include a block like the following to turn off and reset the lights when the mBot is driving forward.
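As a typed illustration of that kind of block, here is a minimal sketch that switches the onboard lights off, assuming the Makeblock MeMCore library and that the onboard LEDs sit on port 7, as in mBlock-generated code:

    #include <MeMCore.h>

    MeRGBLed led(7, 2);       // the two onboard LEDs on the mCore board (assumed port 7)

    void setup() {
      led.setColor(0, 0, 0);  // red, green and blue all set to 0 = lights off
      led.show();             // apply the new colour to the LEDs
    }

    void loop() {
    }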
I can’t hear the buzzer.
Check that the note isn't played for too short a time, or pitched too high. Many people have trouble hearing high-frequency notes, which are the note names ending in larger numbers.
mBlock glossary

Did you know? Learning to program a robot includes coding and is a bit like learning how to speak the language of the robot.

To help you on your journey, this glossary contains mBlock script blocks that are used in Code Smart as well as some common words used in the coding world. Soon enough you'll be comfortable with all the new lingo!

How to use this glossary

This glossary is divided into two primary sections: 1) mBlock script blocks and 2) common words used by coders.

The mBlock script blocks section is organised by block type, and the blocks appear in the same order as in the mBlock environment. There is an image and description of each block used. Additionally, extra notes, such as what is in the drop-down menus of each block, are included for your convenience. The drop-down menus in each block are referred to by number, indicating their position within the block from left to right.

mBlock script types

The common words used by coders are organised alphabetically, like a typical glossary.
Robots
These blocks either gather data from the robot as inputs or result in actions of the robot as outputs.

Movement
Description: Tells the robot to move or turn in a certain direction and at a certain speed.
-- Drop-down 1: run forward, run backward, turn right, turn left
-- Drop-down 2: speeds of 255, 100, 50, 0, -50, -100, -255 (negatives indicate movement in the opposite direction)

Motor
Description: Adjusts the individual motor speeds; M1 should be the left motor and M2 should be the right motor.
-- Drop-down 1: M1 and M2, referring to the left and right motors
-- Drop-down 2: speeds of 255, 100, 50, 0, -50, -100, -255 (negatives indicate movement in the opposite direction)

LED
Description: Adjusts the LED lights on the mBot.
IR remote
Description: Indicates a button on the IR remote.
Control
These blocks control the code, including when and how long it runs.

Wait
Description: Tells your robot to wait a number of seconds before moving on to the next line of code. The number (#) can be changed by typing.

Forever
Description: Makes whatever is inside it repeat continuously, forever.

If
Description: Makes whatever is inside it happen IF the specific condition is met. You can put other blocks into the hexagon in the first line to establish a condition to check for.

If — else
Description: Makes whatever is inside the first section happen IF the condition is met; ELSE whatever is in the second section will happen. You can put other blocks into the hexagon in the first line to establish a condition to check for.

Note: The If and If — else blocks allow you to write conditional statements. Whether the instructions inside the block will run depends on whether the condition in the first line is met.
Operators
These blocks are used to do mathematics and construct conditions used in Boolean logic, such as 'and', 'or' and 'not'.

Or
Description: Checks if either condition or both conditions are true.
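For teachers who want to relate these blocks to the typed Arduino code shown in mBlock's right-hand panel, the sketch below gives a rough, illustrative mapping only; it assumes the Makeblock MeMCore library and an ultrasonic sensor on port 3, and it is not the exact code that mBlock generates:

    #include <MeMCore.h>

    MeUltrasonicSensor ultrasonic(PORT_3);  // assumed sensor port
    MeDCMotor motorLeft(M1);
    MeDCMotor motorRight(M2);

    void setup() {
    }

    void loop() {                                  // 'forever' block: loop() repeats continuously
      float distance = ultrasonic.distanceCm();
      if (distance < 10 || distance > 300) {       // 'or' block: true if either condition is true
        motorLeft.run(0);                          // 'if' section: runs when the condition is met
        motorRight.run(0);                         // stop if an obstacle is close or the reading looks unreliable
      } else {                                     // 'if — else' block: otherwise run this section
        motorLeft.run(-100);                       // drive forward (the left motor is mirrored)
        motorRight.run(100);
      }
      delay(1000);                                 // 'wait' block: pause for 1 second
    }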
Common words used by coders

Computer – a machine designed to follow instructions and process data
Computer logic – the basic rules that a computer follows
Futures Thinking

Did you know? Futures Thinking is a cross-disciplinary approach to considering potential futures through the exploration of trends.

The future will not be the same as the present and the future is not fixed or confirmed. However, we have the ability to shape what the future will be like through our decision making and actions now. As the saying goes, 'children are our future', and education is all about preparing them for the future.

What is Futures Thinking?
Futures Thinking is a cross-disciplinary approach to considering potential futures through the exploration of trends and drivers for change that may lead to different future scenarios. This includes evaluating which scenarios are possible, probable and preferable futures. This is not about predicting the future, but rather critically considering the future, so that we can better make decisions and take actions in the present.

OECD Futures Thinking in Action

Why is it important?
Futures Thinking enables you to consider the major changes in the next 5, 10, 20 or more years in all areas of life, including social interaction, education and technology. While the future cannot be reliably predicted, we can critically consider the future, so that we, as individuals and groups, can be more deliberate with actions, decisions and policies that may help promote desirable futures and help prevent undesirable ones.
Did you know? Only by considering the future can we deliberately help shape it.

How to incorporate Futures Thinking into your classroom
Futures Thinking is not limited to policy makers and corporate executives. It can apply to all areas of life and be done by anyone. Thus, we encourage you to engage in Futures Thinking with your classroom. In order to engage with Futures Thinking, there are a few things that you must consider:

• Existing situation – What is happening now and why? Who benefits and who loses?
• Trends – How does the existing situation compare to the past? Are
there patterns in the changes?
• Drivers – What is causing the changes? The causes might be specific
community perceptions, beliefs, values or attitudes. It might be that
other changes have caused ripple effects, such as demographic
changes, environmental damage, developments in science and
technology or changes in political policy.
• Possible futures – What might happen in the future?
• Probable futures – What is most likely to happen in the future? Which
trends and drivers are likely to persist?
• Preferable futures – What do you want to happen in the future? Why?
Who benefits and who loses?
You can incorporate these questions into discussions, journal entries, research
projects and presentations.
About Oxbotica
Autonomous vehicles can help make roads safer.

Currently, vehicles on the road rely on the intelligence of humans to decide what to do. So in order to have autonomous vehicles in the future, there must be an intelligent system that decides what the car should do. Oxbotica is developing this.

Oxbotica Geni

Oxbotica has developed a system called Selenium that uses patented algorithms to interpret the environment around the vehicle and to navigate the vehicle safely and efficiently. Different sensors, mounted onto the vehicles, take in data about their environment. Selenium then uses this information to make decisions about where to go next, for example turning right, slowing down, or stopping. What makes Oxbotica's software solution special is its ability to learn. Data from different vehicles is shared within Selenium, so that the system actually becomes more intelligent over time.
Computers are better suited to processing large amounts of data and don't
get distracted. Autonomous vehicles can help make roads safer and reduce
the amount of wasted time in traffic. They can also increase accessibility
of transportation for people who can’t currently drive because of physical
impairments like poor eyesight or epilepsy. More efficient driving enabled by
intelligent systems will also reduce carbon emissions and road wear and tear,
both beneficial for the environment.
Photo credits
Cover Autonomous vehicle: Oxbotica
Students: Start Dream Big
Student sheet 2a mBot: Makeblock
Student sheet 2b ‘DC motor’ by Dcaldero8983 on Wikimedia Commons is used under a Creative Commons Attribution-ShareAlike 3.0
Unported license <https://creativecommons.org/licenses/by-sa/3.0/deed.en>
‘Servo motor’ by oomlout on Wikimedia Commons is used under a Creative Commons Attribution-Share Alike 2.0
Generic license <https://creativecommons.org/licenses/by-sa/2.0/deed.en>
‘Stepper motor’ by oomlout on Wikimedia is used under a Creative Commons Attribution-Share Alike 2.0 Generic
license <https://creativecommons.org/licenses/by-sa/2.0/deed.en>
Student sheet 3a mBot exploded view diagram: Makeblock
Student Sheet 4a Lidar viewer: Oxbotica
mBot exploded view diagram: Makeblock
Student Sheet 6a LEDs: Pixabay, halejandropmartz
Student Sheet 6b mBot: Makeblock
Student Sheet 9a Bus shelter ad is a derivative of ‘Bus shelter’ by Albert Bridge on Geograph.ie used under a Creative Commons
Attribution-ShareAlike 2.0 Generic license <https://creativecommons.org/licenses/by-sa/2.0/deed.en>
SU How can autonomous vehicles be useful? Parking: Pexels, Tuur Tisseghem
SU What questions does an autonomous vehicle need to answer? Geni vehicle: Oxbotica
SU When can a car be considered autonomous? Hands-free driving: Oxbotica
SU How to set up mBlock mBot cartoon: Makeblock
SU Coding tips: writing elegant code mBot cartoon: Makeblock
SU Coding tips: failing and debugging Robot fail: DARPA
SU Coding tips: commenting mBot cartoon: Makeblock
SU Troubleshooting guide mBot components: Makeblock
SU mBlock glossary mBot cartoon modified with bug from mBot cartoon by Makeblock
SU About Oxbotica Geni vehicle: Oxbotica
All other photos Encounter Edu
Code Smart is a new approach to teaching computing
to prepare students for the future. Computer scientists
and engineers are designing the next generation of
driverless cars and the careers and opportunities in this
space are expanding rapidly. Set against this real-world
context of autonomous mobility, Code Smart brings the
excitement of robotics, artificial intelligence and coding
into the classroom.
Where Learning Meets The World

Encounter Edu designs and runs STEM and Global Citizenship education programmes, which make use of virtual exchange, live broadcast and virtual reality. These technologies create classroom encounters that widen young people's world view. Learning is further underpinned by an online library of teacher resources and training. Combined, these provide children with the experience and knowledge to develop as engaged citizens and critical thinkers for the 21st Century.

www.encounteredu.com