Simulink Raspberry Pi Workshop
Workshop Manual
A brief workshop on Simulink Support for Project
Based Learning with Raspberry Pi
MathWorks, Inc.
4/14/2015
Licensed under a Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) License
The licensor cannot revoke these freedoms as long as you follow the license terms.
Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes
were made. You may do so in any reasonable manner, but not in any way that suggests the licensor
endorses you or your use.
ShareAlike — If you remix, transform, or build upon the material, you must distribute your
contributions under the same license as the original.
No additional restrictions — You may not apply legal terms or technological measures that legally
restrict others from doing anything the license permits.
Simulink and Raspberry Pi Workshop Manual
Contents
Motivation for this Workshop 4
Getting Started 4
Introduction: How to Use and Work with this Manual 5
I1 Workshop Run-through Options 5
I2 Notation and Formatting 5
Getting Familiar with Simulink 6
P0.1 Simulink 6
P0.2 Simulink Library Browser and Simulink Model 6
P0.3 Simulation with Simulink 8
P0.4 More on Simulink 10
Project 1: Blink LED – First Application with Simulink and Raspberry Pi 11
P1.1 Blink the Small Green LED on the Raspberry Pi 11
P1.2 Build Your Own BlinkLED1 Model 13
Project 2: Object Detection with Video Processing 17
P2.1 Image Inversion: Simple Image Processing 17
P2.2 Object Detection: Simple Object Detection 19
P2.2.1 Part 1: Build the Object Detection Model 19
P2.2.2 Part 2: Find the Centroid 21
P2.2.3 Part 3: Mark the Centroid on the Image 22
P2.3 Optional: Using General Purpose IO Pins 23
Project 3: Edge Detection 24
P3.1 Finding the Slope Local Maxima for a Two Variable Function 24
P3.2 Application Example: Edge Detection Algorithm 26
P3.3 Implementing the Algorithm on Raspberry Pi 27
P3.4 Computing the Gradient as Two Dimensional Convolution 28
P3.5 Analyzing Alternative Algorithm Implementations 30
P3.6 Brief Task Overrun Analysis 32
Appendix 1: Background and References 34
Appendix 2: System Requirements and Setup 35
Appendix 3: Working with Raspberry Pi 38
Appendix 4: Raspberry Pi Driving Hardware using GPIO Pins 39
Motivation for this Workshop
As industry, technology and society make rapid progress, there is a growing need to teach the next generation
of engineers ever more complex concepts and to build their intuition quickly, so that they can apply their knowledge to
develop the technology of the future. This calls for hands-on, project-based learning with low-cost, easy-to-use
hardware and software platforms, making it easier and more fun to teach, learn and test engineering ideas.
Simulink® has long been the industry's tool of choice (automotive, aerospace, electronics, etc.), since it
provides a very user-friendly, collaborative and flexible environment for designing, simulating, testing and
eventually implementing complex, multi-domain systems and their control logic.
Simulink includes the capability to program low-cost hardware like LEGO® Mindstorms® robots, Arduino®,
Raspberry Pi®, and many others. This capability enables students to develop and test a variety of control,
signal, image and video processing applications from within Simulink and Stateflow®.
This workshop is based on the Simulink Support Package for Raspberry Pi. You will have a chance to work through lab
modules with examples of image and video processing, and will gain practical hands-on experience in building
high-level examples yourself. Additionally, participating faculty members will have a chance to understand
the potential for use in the classroom with students.
Getting Started
If you are new to MATLAB®, Simulink or Simulink Support Packages, take a look at the following introductory
material to get started:
a. To learn more about MATLAB and Simulink, check out interactive tutorials at:
mathworks.com/academia/student_center/tutorials/
b. For the latest information about Raspberry Pi Support from Simulink, see:
mathworks.com/hardware-support/raspberry-pi.html
c. Supported hardware for project based learning:
mathworks.com/academia/hardware-resources/
Introduction: How to Use and Work with this Manual
Section            Topics Covered                             Level          Recommended Experience
I1, I2             Run-through options, Notation              Introductory   None
                   and Formatting
P0.1, P0.2, P0.3   Getting Familiar with Simulink             Beginner       None
P1.1               Checking Product Installation              Beginner       Simulink
P1.2               System Design with Simulink                Experienced    Simulink, SSP* for RPi
P2.1 – P2.4        Image Inversion / Object Detection         Experienced    Simulink, SSP* for RPi
P3.1 – 3.6         Edge Detection                             Advanced       Simulink, SSP* for RPi
P4.1 – 4.3         Optional: GPIO & C Programming             Experienced    Simulink, SSP* for RPi
*SSP = Simulink Support Package
Getting Familiar with Simulink
P0.1 Simulink
Simulink® is a block diagram environment for multi-domain simulation and Model-Based Design. It supports
system-level design, simulation, automatic code generation, and continuous test and verification of embedded
systems. Simulink provides a graphical editor, customizable block libraries, and solvers for modeling and
simulating dynamic systems. It is integrated with MATLAB®, enabling you to incorporate MATLAB algorithms into
models and export simulation results to MATLAB for further analysis.
Key capabilities
With the help of code generation products like Simulink Coder and Embedded Coder, Simulink models can also be
converted to C-code optimized for specific embedded platforms.
The Simulink Library Browser is a collection of high level blocks that you can use to
create a block-diagram representation of the system you are trying to design. Put
differently, these blocks allow you to access or generate data, apply
algorithms, and visualize or save the processed information that flows
through the system.
Figure 2: Launch Simulink Library Browser
Once the Simulink Library Browser launches (Figure 2), you will see a window like
Figure 3 below. Depending on the products included in the MATLAB installation, you will see some or all of the
block libraries. In particular, we will work with the Simulink Support Package for Raspberry Pi Hardware, which is
highlighted below.
A Simulink Model (Figure 4) represents the high level design of a system or an algorithm. You can create models
by dropping blocks from the Simulink Library. After that, you can run the simulation or deploy it to the hardware.
Say, for example, XYZ Automobiles is trying to design a new car. As you can imagine, a car is a complicated system
with different mechanical, electrical and hydraulic components, whose interaction is controlled by intelligent
computers running embedded code. Any mistake in the design process can result in significant revenue loss for
XYZ Automobiles. To avoid this, they design the car and test a significant part of its performance in a
computer simulation environment before they even build it.
The industry-standard way to do this is to create mathematical models of the different components, and then
simulate their interaction under intelligent algorithms that keep the overall system behaving as intended. This
approach is now becoming standard across the automotive, aerospace, defense and hi-tech industries.
A Simulink model allows you to design these intelligent algorithms and test them, before running them in the real
world. In addition to that, the intelligent behavior can also be translated to code in different low-level languages
– e.g. C, HDL, PLC text – that can be deployed directly to embedded controllers like the Engine Control Unit (ECU)
in your car.
We will work with the code generation capability of Simulink for the rest of this workshop. Let’s take a look at the
simulation capabilities to start with.
Once you have created a Simulink model (Figure 6) representing your system, and set up appropriate simulation
parameters, you can just click on the green RUN button, and observe the behavior of your system over time.
When you click on the green run button (Figure 5), the Simulink model that you created with built-in or custom
blocks is translated to an equivalent differential equation. Simulink then uses one of the built-in differential
equation solvers to solve this differential equation over time and, consequently, simulate the dynamic behavior
of the system.
Given the complex nature of this whole process, there are several pieces of information needed from the system
designer – e.g. the nature of data exchanged between blocks, some notion about the dynamic nature of the
system, simulation time and step size, interaction preferences, etc. All of these can be controlled by changing the
appropriate settings in the ‘Model Configuration Parameters’ (Figure 8) of each Simulink model.
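If you prefer working from the command line, the same settings can also be read or changed programmatically. A small sketch (the parameter names 'Solver' and 'StopTime' are standard Simulink configuration parameters; the model is the bouncing-ball demo used in the steps below):
load_system('sldemo_bounce')                 % load the model without opening the editor
get_param('sldemo_bounce', 'SolverType')     % returns 'Variable-step' or 'Fixed-step'
set_param('sldemo_bounce', 'Solver', 'ode23', 'StopTime', '25');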
To learn more about key features of Simulink, check out the following webpage:
mathworks.com/products/simulink/features.html
a. Open the model for simulating a bouncing ball, by typing sldemo_bounce at the MATLAB Command
Prompt.
b. Check model configuration from Simulation > Model Configuration Parameters (Figure 8)
c. Click the green [Run] button (Figure 5) to run the simulation and observe the behavior (Figure 7).
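The same demo can also be opened and simulated from the MATLAB prompt, for example:
open_system('sldemo_bounce')    % open the bouncing-ball model
sim('sldemo_bounce');           % run the simulation programmatically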
Figure 5: Run Button
P0.4 More on Simulink
MATLAB Desktop > Help > Documentation > Simulink > Examples
Project 1: Blink LED – First Application with Simulink and Raspberry Pi
P1.1 Blink the Small Green LED on the Raspberry Pi
Tasks/Challenge:
Steps/Approach:
1. If a model is already running, click the stop button (Figure 10).
2. Within MATLAB, navigate to the Working directory.
3. Open the model BlinkLED1.slx by double-clicking on the file in the Current Directory browser.
Figure 10: Stop button
4. Check that the Raspberry Pi is powered on and connected to the host PC with an Ethernet cable.
Select the BlinkLED1.slx > Tools > Run on Target Hardware > Options menu item and check the following
settings (Figure 11). Note your Host name IP address and click [OK] to close the window.
Note: Your Host name IP address may be different; common values are: 169.254.0.31, or 169.254.0.2.
Figure 11: BlinkLED1 Configuration Parameters
5. To test your connection to the Raspberry Pi, at the MATLAB prompt type !ping 169.254.0.31 (or
using the Host name IP address of your Raspberry Pi from step 4). Ensure that you get a positive
response and not a “Request timed out” reply.
6. On the BlinkLED1 model window, [Deploy] the model to the Raspberry Pi (Figure 12). Note that:
This action initiates the build, download and run procedure. Progress is reported in the status bar at
the bottom left of the model window, terminating with the message, ‘Model successfully downloaded
…’
The build process automatically starts / runs the executable on the Raspberry Pi as can be observed
from the blinking LED (LED0) on the Raspberry Pi (Figure 13).
Figure 12: Deploy to Hardware button Figure 13: LED0 on Raspberry Pi board
Additional notes:
It may take a few seconds to open a Simulink model, unless you have already started Simulink.
It is not possible to compile the model without having a Raspberry Pi connected. Simulink will produce
an error message.
Tip: If you do not have a Raspberry Pi, you can test the syntactic integrity of your model using the
menu option Simulation > Update Diagram, (or simply [Ctrl-D]).
The program running on the Raspberry Pi can be started and stopped from MATLAB as follows:
o At the MATLAB prompt assign h = raspberrypi. This will create a Raspberry Pi object in
MATLAB which can then communicate with the unit. You can note the parameters of the
object.
o Type h.connect at the MATLAB prompt which connects the object to the Raspberry Pi.
o h.stop('BlinkLED1') stops the program running and h.run('BlinkLED1') runs
the application BlinkLED1.
o It is also possible to log on to the Raspberry Pi via a shell. h.openShell('ssh') opens a
shell. The username is pi and the password is raspberry. The program can also be run and stopped like
any other Linux application at the OS level of the Raspberry Pi.
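Put together, the whole sequence at the MATLAB prompt looks roughly like this (method names as listed above; the exact syntax may differ between support package releases):
h = raspberrypi;         % create a Raspberry Pi object on the host
h.connect                % connect the object to the board
h.run('BlinkLED1')       % start the deployed application
h.stop('BlinkLED1')      % stop it again
h.openShell('ssh')       % optional: open an SSH shell (user pi, password raspberry)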
P1.2 Build Your Own BlinkLED1 Model
4. After configuring the Pulse Generator, Convert and LED blocks according to the settings in the table
above, their dialog boxes should look like Figure 14:
Figure 14: Block Settings for Pulse Generator & Convert blocks for BlinkLED1.slx model
Note: We use a sample-based pulse generator, since running on hardware requires fixed-step solvers.
5. Use Display > Signals and Ports > Port Data Types and select the Simulation > Update Diagram menu
option (or Ctrl + D shortcut) to highlight signal types and potential mismatch/truncation issues.
Simulink ensures that the data types of Raspberry Pi block inputs are automatically type cast to the
appropriate data type for the underlying hardware, but it is good practice to be aware of the data
types used in your model.
6. To configure the model for real-time implementation on the Raspberry Pi select Tools > Run on Target
Hardware > Prepare to Run… and check for the following settings in the Run on Target Hardware and
Solver panes (see Figure 15 & 16):
Figure 15: Run on Target Hardware pane
7. Click [Apply] to apply the changes and [OK] to dismiss the dialog. It is also good practice to [Save] your
work before proceeding to run the model.
8. Verify that the Raspberry Pi is powered-on and connected.
9. [Deploy] the model to the Raspberry Pi (Figure 12).
10. Once the program starts running you should see the blinking LED (Figure 13).
Additional notes:
The program can be started and stopped from MATLAB as follows:
o At the MATLAB prompt assign h = raspberrypi. This will create a Raspberry Pi object in
MATLAB which can then communicate with the unit.
o Type h.connect at the MATLAB prompt which connects the object to the Raspberry Pi.
o h.stop('myBlinkLED1') stops the program running and h.run('myBlinkLED1')
runs the application myBlinkLED1.
Project 2: Object Detection with Video Processing
P2.1 Image Inversion: Simple Image Processing
Tasks/ Challenge: Design and build an application that takes a video input and inverts the colour values. Also
use External Mode to change the "inversion" threshold and examine the effect on the resulting video.
Steps / Approach:
2. Apply the same procedure to construct the ImageInversion model shown in Figure 17. You must configure
the model for running on Raspberry Pi. You’ll also need to enable External Mode* as shown in Figure 18.
When using External Mode, you need to use the RUN button to deploy the model, and the STOP button
to stop the model (Figure 18). The resulting display is shown in Figure 19.
* External Mode (Figure 18) allows you to run your model on the Raspberry Pi, while still using Simulink
on your host computer as an interface to interact with your model. This allows for live parameter tuning
– you do not need to stop, edit and restart the model. You just change the parameter (e.g. value in
constant block in Figure 17) and the parameter is automatically passed to the Raspberry Pi which uses the
new parameter without pausing its operation. External Mode also allows you to log and view data while
using the Simulink window. With this model we can see the output of the model with the SDL Display block
(Figure 18) on our host computer.
Note: Remember when using External Mode, you need to use the RUN button to deploy the model, and
the STOP button to stop the model (Figure 18).
Note: If your model stops working after 10 seconds, you may have forgotten to adjust the Stop time in
the Solver pane (Figure 16).
Note: To set the Constant Output Data Type to uint8, use the Signal Attributes tab. Double click on the
Constant block and go to the Signal Attributes tab in the dialog box.
Figure 18: External Mode, Run, and Stop buttons Figure 19: Image Inversion display
P2.2 Object Detection: Simple Object Detection
Objective: To build an application that detects a green object and tracks it around the screen by placing
a red square at the centroid of the object. Learn about the following product features: External Mode, the data logging
features of scopes, how to build subsystems, and how to use MATLAB Function blocks and built-in functions.
Task/ Challenge: Build a Simulink model, in stages, that is capable of detecting a green object in the input video.
The idea is to perform an appropriate arithmetic transformation on the RGB components of the colour image to
isolate the green content, and then threshold the result to obtain a binary image of the detected object.
Inside Subsystem
Library                      Block                   Settings
Commonly Used Blocks         Constant                Constant Value: 255
Commonly Used Blocks         Sum                     Icon Shape: Round or Rectangular; List of signs: --+
Commonly Used Blocks         Inport
Commonly Used Blocks         Outport
Math Operations              Product                 No. of Inputs: */
Math Operations              Gain                    Gain: 255
Logic and Bit Operations     Relational Operator     Relational Operator: >=
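As a very rough MATLAB sketch of the kind of pixel arithmetic these blocks implement (the variable names and the exact ordering of the Sum inputs are assumptions here; the authoritative wiring is shown in Figures 20 and 21 below):
R = rgb(:,:,1); G = rgb(:,:,2); B = rgb(:,:,3);   % uint8 colour planes of one video frame
greenness = G - R - B;        % Sum block, list of signs --+; uint8 arithmetic saturates at 0
bw  = greenness >= threshold; % Relational Operator >= gives a logical (binary) image
out = uint8(bw) * 255;        % Gain of 255 so detected pixels display as white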
2. Build ObjectDetection.slx model as in Figure 20 & 21. The Subsystem block helps you visually organize
your model and does not affect the numeric calculations or operation on the Raspberry Pi. Inports and
Outports of the Subsystem block define the interface to the higher level in the hierarchy.
3. Note that all of the computations are to be performed in uint8. (Answers on last page of manual)
a. How would you ensure that the outputs of the blocks are of the correct type at the end of each
block computation? Hint: Display > Signals & Ports > Port Data Types
b. Why should the value of Gain be 255 to display a binary image?
c. How would you ensure that the Add block does not overflow?
4. Ensure that the external mode is enabled (Figure 18).
5. Having ensured the Raspberry Pi is powered-on and connected, click the RUN button to deploy the model
in external mode.
6. What do you see on the display? Modify the value of the Threshold and see how it affects the size and
shape of the binary image.
7. Add the Scope block and user defined MATLAB Function block, and then insert the code as shown below
in Figures 22 & 23.
function [pos, detect] = FindCentroid(bw)
%#codegen
[r, c] = find(bw);
if isempty(r)
    pos = [-1; -1];
    detect = false;
else
    pos = [mean(r); mean(c)];
    detect = true;
end
Figure 23: Finding the centroid script
8. You can use the logging facility in the Scope (logging the historical information stored by the Scope) to
record the time response of the centroid of the object. To do this, open the Scope, click Parameters
(the gear icon) in the upper left of the Scope window, and edit the History tab (Figure 24).
9. Log some data and plot the results in MATLAB to look at the trajectory of the object over time. Note that
you can do time responses as well as plotting the x, y coordinates against each other.
10. You can plot the data using plot or simplot commands.
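As a rough sketch of that post-processing (assuming the Scope's History tab is set to save the data to a workspace variable named ScopeData in 'Structure with time' format):
t   = ScopeData.time;                 % simulation time stamps
pos = ScopeData.signals(1).values;    % logged centroid positions [row, col]
plot(t, pos)                          % time response of both coordinates
figure, plot(pos(:,2), pos(:,1))      % x-y trajectory of the centroid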
11. Create a model ObjectMarker.slx as below (Figure 25 & 26):
12. Having ensured the Raspberry Pi is powered-on and connected, click the RUN button to deploy the model
in external mode.
13. Test your model with a green object. Do you see a red square tracking the centroid of your green object?
What happens if you introduce a second green object?
Project 3: Edge Detection
P3.1 Finding the Slope Local Maxima for a Two Variable Function
Objective: To build an application that computes the slope local maxima for a two variable function. Learn to
simulate the system model, explore parameter tuning, and show simulation results.
Tasks/ Challenge: Build a Simulink model that determines the slope local maxima for a two variable function.
The idea is to implement a two-step approach that first determines the slope of the two-variable function by
computing the gradient magnitude of that function, and then determines the slope local maxima coordinates by
finding where the slope function exceeds a given threshold.
Figure 27 & 28 show the one-dimensional case, while Figure 29 & 30 depict a two-variable function and its
gradient magnitude, respectively.
Figure 27: Steep slope of a single-variable function. It may be interpreted as the strong image contrast for an edge (one-dimensional view)
Figure 28: Maximizing the gradient function with thresholding. It may be interpreted as the image edge detection algorithm (one-dimensional view)
Figure 29: Two-variable function plot Figure 30: Gradient magnitude function
Steps / Approach:
1. Open the MaximumSlope_start.slx by double-clicking on the file in the Current Directory browser (Figure 31).
Note that this model uses model callback functions to set the variable z and initialise the figures. This is
a common way to initialise data in a Simulink model.
2. [Search] for the blocks listed in the table below and [Drag and Drop] them into the MaximumSlope_start.slx
model.
3. Rename the Subsystem as MaximumSlope, double click it and build the algorithm subsystem with the blocks
of the above table, as in Figure 32.
4. Double click the MATLAB Function block and edit it so that it starts as follows (one possible body for the
function is sketched after these steps):
function y = ComputeGradient(u)
%#codegen
5. Double click the Gain block. In the Signal Attributes tab, set the Output data type to uint8.
6. Set the Simulation Stop Time to inf.
7. [Run] the model in “Normal” Simulation mode. While the simulation is running, double click the Threshold
block. Modify its value and see its impact on local maxima detection (Figure 33).
8. [Save] your model as MaximumSlope.slx in the working directory.
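The full body of ComputeGradient is not reproduced in this text. A minimal sketch of one possible body, using MATLAB's gradient function to obtain the slope (gradient magnitude), could be:
function y = ComputeGradient(u)
%#codegen
[gx, gy] = gradient(single(u));   % partial derivatives in the x and y directions
y = sqrt(gx.^2 + gy.^2);          % gradient magnitude, i.e. the slope of the surface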
Figure 31: Starting model for slope local maxima
Figure 32: Maximum slope subsystem
P3.2 Application Example: Edge Detection Algorithm
Objective: To build an algorithm that can detect the edges on an input image from a file or an image acquisition
device.
Tasks/ Challenge: Given that the image intensity may be regarded as a two-variable function, build a Simulink
model that detects the edges by computing its slope’s local maxima, as explained in the previous Section (Figure
26).
Steps / Approach:
1. Open the edgeDetectionReference_start.slx by double-clicking the file in the Current Directory browser
(Figure 34).
2. Open your previously saved MaximumSlope.slx model, copy the MaximumSlope Subsystem and paste
it into edgeDetectionReference_start.slx.
3. Connect all the signals properly and save the model as edgeDetectionReference.slx in the current
directory.
4. [Run] the model in “Normal” Simulation mode. While the simulation is running, double click the
Threshold block. Find an appropriate value that provides accurate edge detection on the test image
(Figure 35).
Optional:
5. You can test the algorithm on real-life images, after they are converted to grayscale. To try the model
with an example image (which has been converted to grayscale for you), double-click the Constant block
called TestImageBitMap, and replace the Constant value field with ylena.
Note: The constant block may act as an image source by defining an array (ylena in our case) as a constant,
which is the pixel-based representation of a given grayscale image.
6. [Run] the simulation and tune the threshold to determine proper edge detection on this image.
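If you would like to try your own picture instead, a similar grayscale array can be created at the MATLAB prompt; a small sketch (the file name is a placeholder, and the weights are the standard luma conversion):
img  = double(imread('myPhoto.jpg'));                                   % any RGB image file
gray = uint8(0.299*img(:,:,1) + 0.587*img(:,:,2) + 0.114*img(:,:,3));   % grayscale uint8 array
You can then use gray (suitably resized) as the Constant value in the same way as ylena.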
Figure 34: Reference Edge Detection starting model
Figure 35: Edge Detection result on test image
P3.3 Implementing the Algorithm on Raspberry Pi
Tasks/ Challenge: Reuse the algorithm that has been verified through simulation, and build the Simulink model
to be deployed on the target hardware.
Steps / Approach:
Figure 36: Edge Detection starting model
Figure 37: Live Edge Detection on Raspberry Pi
P3.4 Computing the Gradient as Two Dimensional Convolution
Tasks/ Challenge: Build a Simulink model that implements the gradient computation as a two-dimensional
convolution.
As shown in the previous Section, edge detection requires computing the derivative of a two-dimensional image.
The Sobel edge detector uses intensity values only in a 3×3 region around each image point to approximate the
corresponding image gradient. More precisely, it uses a pair of 3x3 convolution masks, one estimating the
gradient in the x-direction (columns) and the other estimating the gradient in the y-direction (rows). As a result,
the mask is slid over the image as a 2-D convolution with that image, manipulating a square of pixels at a time.
The actual Sobel masks are shown below:
x-direction (kx):        y-direction (ky):
  1   0  -1                1   2   1
  2   0  -2                0   0   0
  1   0  -1               -1  -2  -1
Steps / Approach:
function y = ComputeGradient(u)
%#codegen
kx = [1, 0, -1;
2, 0, -2;
1, 0, -1];
ky = [1, 2, 1;
0, 0, 0;
-1, -2, -1];
% 2-D convolution
gx = conv2(u,kx,'same');
gy = conv2(u,ky,'same');
% gradient magnitude (completion; the original listing is truncated here)
y = sqrt(gx.^2 + gy.^2);
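The 'same' option makes conv2 return an output with the same dimensions as the input image, so the two gradient estimates stay aligned pixel-for-pixel with the original frame.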
3. Ensure the Raspberry Pi is powered-on and connected to the computer through the Ethernet cable.
Connect the webcam to the USB port on the Raspberry Pi.
4. Click the RUN button to run the model in external mode. Double click the Threshold block. Tune its value
to determine proper edge detection on the video sequence.
P3.5 Analyzing Alternative Algorithm Implementations
Objective: To simplify the computation of the gradient function. Compare the results to the reference algorithm
and evaluate the difference through simulation.
Tasks/ Challenge: Build a Simulink model that implements the gradient computation by means of difference
functions. Implement a testbench to measure the difference between the golden reference and the simplified
algorithm.
In the previous section, we learnt that, according to the Sobel method, the horizontal and vertical gradient
components may be computed as 2-D convolutions of the input image with two specific 3x3 masks.
Let us now consider alternative, simpler ways to implement such an algorithm.
In order to estimate the gradient components 𝐺𝑥 and 𝐺𝑦 for a given image pixel at position (i, j), we can make the
following assumption: we consider only the contribution of the pixels lying on the ith row and the jth column to
compute 𝐺𝑥 and 𝐺𝑦 , respectively. In other words, we can use a simple difference function to approximate the
computation of the derivative in the horizontal and vertical directions, as depicted in Figure 38.
Gx(i, j) ≈ u(i, j+1) − u(i, j−1),    Gy(i, j) ≈ u(i+1, j) − u(i−1, j)
Steps / Approach:
3. Open the Subsystem and edit the MATLAB Function block as follows:
function y = ComputeGradient(u)
%#codegen
gx = zeros(size(u),'single');
gy = zeros(size(u),'single');
% central differences (completion; the original listing is truncated here)
gx(:,2:end-1) = single(u(:,3:end)) - single(u(:,1:end-2));
gy(2:end-1,:) = single(u(3:end,:)) - single(u(1:end-2,:));
y = sqrt(gx.^2 + gy.^2);
6. [Run] the model in “Normal” Simulation mode and observe the difference image. Try to minimize the
difference by tuning the thresholds.
Optional:
7. Change the input image to ylena and observe the results.
8. Open the previously saved edgeDetectionFilter_RPi.slx model. Replace MaximumSlopeFilter Subsystem
with MaximumSlopeSimplified. [Save] the model as edgeDetectionSimplified_RPi.slx.
9. Click the RUN button to run the model in external mode and validate the simplified edge detection algorithm
on the live video from the webcam.
P3.6 Brief Task Overrun Analysis
Tasks/ Challenge: Run the Simulink model on the target hardware with overrun detection enabled to see if
any task overrun occurs during execution. A task overrun occurs when a task is scheduled to execute before a
previous instance of the same task has completed.
Standard scheduling works well when a processor is moderately loaded. However, if the processor becomes
overloaded because of the high amount of processing involved with a specific algorithm, task scheduling may fail,
thus affecting system performance.
Frame dropping is a typical effect of task overruns. The Simulink Support Package for Raspberry Pi enables actual
task overrun detection for a specific model running on the target hardware.
Steps / Approach:
4. On the Simulink model window, [Deploy] the model to the Raspberry Pi. Wait for the model to run on
Raspberry Pi.
5. The Windows command prompt will show the overruns detected when the task rate is too fast, as
shown in Figure 40.
Appendix 1: Background and References
Background:
The Raspberry Pi is a credit-card sized, low-cost, single-board computer with audio and video input/output,
designed for teaching. The Raspberry Pi features a Broadcom® system-on-a-chip which includes an ARM
processor, 512 MB or 1 GB of RAM, and a VideoCore IV GPU. Raspberry Pi provides peripheral connectivity for
stereo audio and digital video (1080p) and supports USB and Ethernet.
In order to harness this capability, however, it is necessary to have an appropriate programming pathway. In
this context, Simulink enables high-level Simulink models to be automatically cross-compiled for execution
on the Raspberry Pi without users having to engage in low-level programming. Furthermore, Simulink’s
‘External Mode’ capability allows users to interact with, monitor and tune the code as it executes fully
autonomously on the Raspberry Pi.
In addition to Raspberry Pi Support from Simulink [4], starting in R2014a there is also Raspberry Pi Support
from MATLAB [3]. This provides a library of MATLAB functions which allow you to acquire data from sensors and
imaging devices attached to a Raspberry Pi. However, the MATLAB code runs on the host computer and not as a
standalone application on the Raspberry Pi. In order to run a model or algorithm on the Raspberry Pi you would
use the Raspberry Pi Support from Simulink and the approach detailed in this manual.
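As a rough illustration of the difference, host-side MATLAB code with that support package looks like the following (the IP address is the example address used elsewhere in this manual, and the credentials are the Raspberry Pi defaults):
r = raspi('169.254.0.31', 'pi', 'raspberry');   % runs on the host and talks to the board
writeLED(r, 'led0', 1);                         % for example, switch the user LED on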
MATLAB and Simulink support for low-cost hardware such as LEGO MINDSTORMS robots, Arduino, and Raspberry
Pi is a professionally supported feature developed by MathWorks [1, 5, 6]. However, other tools with similar
aims have previously been available on the MathWorks File Exchange site.
References:
Appendix 2: System Requirements and Setup
Step 1: Install and ‘Select’ a C Mex Compiler for MATLAB and Simulink
Simulink requires a C compiler in order to compile the auto-generated C code that is downloaded to the
Raspberry Pi. This compiler must be installed and ‘selected’ prior to using Support Package for Raspberry Pi
Hardware.
1. Check whether you are running the 32-bit (PCWIN) or 64-bit (PCWIN64, MAC64) version of MATLAB by
typing computer at the MATLAB prompt.
2. 32-bit versions of MATLAB ship with a C mex compiler, but the installed compiler must also be ‘selected’.
Type mex -setup at the MATLAB prompt and follow the on-screen prompts to do this.
3. 64-bit versions of MATLAB do not ship with a C compiler. Download and install a supported compiler. If
using the free Microsoft SDK, first install the .NET Framework 4.0, then install the Microsoft SDK. Once
installed, the compiler must also be ‘selected’ as described in Step 2 above.
Step 2: Automated Download and Install of the Raspberry Pi Support from Simulink
Brief installation notes are provided below. More detailed installation instructions, including screen shots can be
found in Help > Simulink > Target Hardware > Raspberry Pi > Install Support for Raspberry Pi Hardware
1. On the MATLAB Toolstrip click Add-Ons and select Get Hardware Support Packages
5. Test the connection between your computer and the Raspberry Pi:
a. Type at the MATLAB command line: !ping 169.254.0.31
b. You should see the result shown below:
You have now successfully set up and tested your Raspberry Pi with the webcam.
Appendix 3: Working with Raspberry Pi
1. Ports: 2 to 4 USB ports, micro USB port for power, Ethernet port, SD card / micro SD socket, HDMI output
port, and a set of GPIO pins (26 or 40).
2. Dimensions: ~85mm x 56mm x 21mm, with overlap for the SD / micro SD card and connectors. Weight is
~45g.
3. Power: 5 V, 700 mA via the micro USB port
4. Models: Raspberry Pi 1 Model A (retired), Raspberry Pi 1 Model B (retired), Raspberry Pi 1 Model A+,
Raspberry Pi 1 Model B+, Raspberry Pi 2 Model B
Note: Simulink currently only supports Raspberry Pi 1 Model B & Model B+, and Raspberry Pi 2 Model B
1. Are there different versions of the Support packages? How do I know what version of a Support
package I have? How do I update my Support Package?
a. Yes, in some cases there are updates to Support Packages
b. To check whether your support package is up to date, start the support package installer
and check the version as shown below:
c. You can select the Update check box and click Next to update the support package.
2. What do I do with the cmd.exe windows?
a. Close them by clicking on the red X in the upper right once they say **Simulation finished**
4. I get intermittent errors where the Raspberry Pi, webcam, or circuits stop working.
a. This is likely because the host computer USB port cannot source enough current/voltage to drive
everything, especially after the voltage drop across the USB to micro-USB cable.
b. Workaround: You will need to purchase a USB power adapter (minimum 5V, 1A). These are
commonly available, and often used for Android devices.
Answers to Questions in Manual:
Q: P2.1: What happens when you change the constant 255 to other values?
A: P2.1: You see the colours changing as the numbers representing the bitmap change. Sometimes you see
massive changes in colour, which is because the numbers are uint8 and you get wrap-around in the arithmetic.
Q: P2.2.1: Step 3: Note that all of the computations are to be performed in uint8.
a. How would you ensure that the outputs of the blocks are of the correct type at the end of each block
computation? Hint: Display > Signals & Ports > Port Data Types
b. Why should the value of Gain be 255 to display a binary image?
c. How would you ensure that the Add block does not overflow?
A: P2.2.1: Step 3:
a. In the Signal Attributes tab of each block, examine the Output data type. Where appropriate you can
change it to uint8 or select one of the “Inherit” options. On the input side, if appropriate, you can select the
‘Require all inputs to have same data type’ check box. Simulink allows many degrees of freedom in how
the computations are performed.
b. 255, because the thresholding produces a binary (0/1) image; multiplying by 255 maps the ones to white
in the uint8 display range.
c. You can select the ‘Saturate on integer overflow’ tick box in the Signal Attributes tab.
Q: P3.5:
1. How do the results of step 8 above using the simplified algorithm compare with the reference model?
2. How would you measure the difference? What could you do to minimize it?
3. Would it be possible to simplify further the edge detection algorithm?
A: P3.5:
1. Generally speaking, the results are different because of the different edge detection implementations.
2. The difference may be computed as the mean value of the absolute difference (mean error) of the two
processed images. To minimize the difference, one could run multiple simulations and identify the
minimum mean error by storing its value against varying threshold values of the two algorithms.
3. The above formula simplifies the computation of the gradient vector magnitude by approximation,
hence reducing the computational load.
Q: P3.6:
a. How can you reduce the video input frame rate for your edgeDetectionFilter_RPi.slx model?
b. What do you think would be the maximum video input frame rate that avoids overruns? How would you verify
that?
A: P3.6:
a. Double click the V4L2 Video Capture block. Its Sample time parameter, 1/fps, sets the input frame rate
(fps = 12 is set in Model Properties -> Callbacks -> PreloadFcn). Change 1/fps to something like 1/5.
b. The maximum video input frame rate may be experimentally determined by running multiple
simulations with different video input frame rates and the ‘Enable overrun detection’ flag checked.