{"id":9762,"date":"2015-03-24T14:01:40","date_gmt":"2015-03-24T03:01:40","guid":{"rendered":"http:\/\/legoeng.local\/?p=9762"},"modified":"2022-07-18T15:51:49","modified_gmt":"2022-07-18T05:51:49","slug":"inside-a-two-step-simple-line-follower","status":"publish","type":"post","link":"http:\/\/legoeng.local\/inside-a-two-step-simple-line-follower\/","title":{"rendered":"Inside a Two-Step Simple Line Follower"},"content":{"rendered":"

\"http:\/\/www.bogatech.org\/\"<\/a>In this unit, we will use the datalogging tools available in the EV3 Software to study the internal operation of a two-step simple line follower with one Light sensor. It is a very simple but powerful example that can help students understand and correctly program similar situations.<\/p>\n

One of the most important functions in both robotics and computer science is the capacity to record information to analyze it in real time or later on. Although it is possible to use the EV3 brick\u2019s screen to display messages and sensor readings, it is not always possible to stop the robot to read its screen. In those cases, we need either to record the data to analyze it later, or display data in real time on the computer, if it is connected to the EV3 brick.<\/p>\n

This unit explains in detail the concept of datalogging by means of manually accessing files, the management of these files from the EV3 Software using uploads and downloads, and their subsequent query and analysis. We will use data conversion, logic, and data wires to control the program flow.<\/p>\n

It also takes advantage of the datalogging feature within the EV3 Software to automatically record data and visualize it graphically on the computer screen. This capacity to generate curves from the sensors\u2019 readings as a function of time can be extremely useful in many diverse situations.<\/p>\n

Finally, it should be mentioned that datalogging can be a very useful way of debugging a program, since on many occasions it makes the program flow, and the reasons behind its decisions, much easier to understand; for example, how a switch works inside a loop.<\/p>\n

This unit is based on my \u201cTeachers Introduction Course to LEGO\u00ae Mindstorms NXT & EV3<\/a>\u201d at BOGATECH<\/a>\u2019s website, where you can find NXT versions of these programming examples and the differences between the NXT and EV3 versions, if you are interested.<\/p>\n

Exercise overview and preparation<\/h2>\n
\"Challenge
Challenge exercise field setup<\/figcaption><\/figure>\n

The first exercise we will do is very simple: we just need a line with an intersection. It will be useful, however, to construct the field for this unit\u2019s final challenge exercise, a circuit I like to call a \u201ccage\u201d that the robot has to follow, since it can be used for all the exercises.<\/p>\n

Use black tape on a white background to build the black line the robot will follow, as shown in the image to the right, where we can also see the path the robot will have to follow to do the final challenge exercise.<\/p>\n

Each team will need a simple two-wheeled robot with a Color sensor (to be used as a Light sensor) attached to the front of the robot, as shown in the image (e.g. LEGO MINDSTORMS Education EV3 Software > Robot Educator > Driving Base + Color Sensor Down). \"line<\/p>\n

A little bit of theory: Basic line following
\n<\/strong><\/h2>\n

How can we make a robot follow a black line on a white background?<\/p>\n

There are several solutions, but with only one Color sensor (used as a Light sensor) the simplest is probably to make the robot follow one of the edges<\/em> of the line. Because the line has thickness, we can program the robot so that when the sensor \u201csees\u201d white, it makes a point turn (a turn with one wheel stopped) towards black, and when it \u201csees\u201d black it turns the opposite way. This algorithm is called a \u201ctwo-step simple line follower\u201d because it only has two actions. It also demonstrates how a combination of small, apparently meaningless local movements (turns to the right or to the left) produces a global movement with a clear purpose, in this case following the line. To program this algorithm, we just need a Switch block set to Color sensor mode (to make the robot turn to each side), inside a Loop.<\/p>\n

\"Two-step
Two-step simple line following algorithm<\/figcaption><\/figure>\n
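The two branches above can be sketched as a short Python simulation (not EV3-G; the threshold of 50 and the sensor trace are invented values):<\/p>\n

```python
# Two-step simple line follower, simulated: one reading per loop
# iteration decides which motor runs (hypothetical threshold of 50).
THRESHOLD = 50  # reflected-light value separating "white" from "black"

def step(reading):
    """Return the motor that runs for this iteration of the loop."""
    if reading > THRESHOLD:   # sensor "sees" white: turn towards black
        return "B"            # motor B runs, motor C stays stopped
    else:                     # sensor "sees" black: turn the other way
        return "C"            # motor C runs, motor B stays stopped

# Invented sensor trace as the robot swings across the line's edge
readings = [80, 75, 60, 40, 20, 15, 35, 55, 70]
actions = [step(r) for r in readings]
print(actions)
```

Notice how each branch repeats for several consecutive iterations before switching, which is exactly the behavior the datalogging exercises make visible.<\/p>\n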

If we study how a two-step simple line follower works, as shown in the image below, we can ask ourselves how many times the program passes through each of the switch options before changing option, during the 10-second timed loop of our example.<\/p>\n

\"Two-step<\/a>
Two-step simple line follower with one color sensor, with Motor blocks<\/figcaption><\/figure>\n
\"Two-step<\/a>
Two-step simple line follower with one color sensor, with Move Tank block<\/figcaption><\/figure>\n

In the previous examples the motors are configured with 30% power, to avoid abrupt swerving, and the timed loop runs for 10 seconds. The Switch block is associated with the Color sensor in \u201cCompare \u2013 Reflected Light Intensity\u201d mode. In the next example, the Color sensor uses \u201cMeasure \u2013 Color\u201d mode to measure the color.<\/p>\n

\"Two-step<\/a>
Two-step simple line follower with one color sensor, with \u201cMeasure \u2013 Color\u201d mode<\/figcaption><\/figure>\n

Tip<\/strong>: When using the Color sensor in \u201cMeasure \u2013 Color\u201d mode, it is important to select the Switch default case, for example the white color. We can also select \u201cNo Color\u201d as the default, so that any color the sensor detects other than black will execute this case. How will the robot behave if we do not configure the \u201cNo Color\u201d case as default? Well, since you need to select a default case, the only other option is to select the black color case, so the program will always execute this branch, even while detecting white, because any detected color is different from \u201cNo Color\u201d! It is important that the students run tests to understand how the Switch block works.<\/p>\n
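The behavior described in the tip can be modeled with a tiny Python function (a hypothetical model of case selection, not the EV3 API):<\/p>\n

```python
def run_switch(detected, cases, default):
    """Model of the Color Switch: an exact case match wins; any other
    reading falls into the branch marked as the default case."""
    return detected if detected in cases else default

CASES = {"Black", "No Color"}

# "No Color" as default: a white reading stays out of the black branch
print(run_switch("White", CASES, default="No Color"))
# "Black" as default: every unlisted color runs the black branch
print(run_switch("White", CASES, default="Black"))
```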

At this point it is important to have the students explain what is happening inside the Loop and, for example, have them replicate the robot\u2019s behavior with their bodies. We will see that most of them do not clearly understand how the algorithm operates. For example, they may not fully appreciate that as the robot moves from one color to the other, the program executes the same<\/em> option or branch of the Switch many times, until the robot moves onto the other side \u2013 many times faster than the robot\u2019s apparent physical execution of the Loop. The following exercises show how the Switch-inside-the-Loop algorithm works for the two-step simple line follower with one Color sensor.<\/p>\n

Exercise 1. Data logging using file access: A numerical and manual debugging strategy<\/h2>\n

The simplest way to demonstrate that the EV3 brick executes the program much faster than the robot\u2019s apparent physical execution of the same program is to record the Switch branch that is being executed in each iteration of the Loop.<\/p>\n

This exercise uses a File Access block to record in a text file the Switch branch that the program has just executed. The final step is to check the generated text file. Create a new program, build the line follower basic algorithm and add the File Access blocks, as shown below.<\/p>\n

First we need to add an initial block configured with the \u201cWrite\u201d action. Set the name of the file we want to create, for example \u201cmotor\u201d, then choose \u201cText\u201d from the \u201cType\u201d pulldown menu and add the initial text \u201c—Start—\u201d, which will help us know where the datalogging starts in the text file.<\/p>\n

Next we will add two \u201cFile Access\u201d blocks, similar to the previous one, at the end of both branches of the switch to write to the file the branch the program has just passed through. For example, when the program goes through the option where motor B runs we will add \u201cB\u201d, and when it goes through the other option we will add \u201cC\u201d, in the \u201cText\u201d textbox. To finish the program, we need to close the file before stopping: add a final block with the \u201cClose\u201d action. Be careful to always enter the appropriate filename, \u201cmotor\u201d in our case. To avoid errors, a good option is to copy a previous block and modify the desired options.<\/p>\n

\"Simple<\/a>
Simple line follower program with \u201cFile Access\u201d blocks<\/figcaption><\/figure>\n
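The same write/append/close flow can be sketched in plain Python (the file name, threshold, and readings are simulated stand-ins for the File Access blocks):<\/p>\n

```python
import os
import tempfile

# Simulated File Access flow: "Write" a start marker, append the
# branch letter on every loop iteration, then "Close" the file.
path = os.path.join(tempfile.gettempdir(), "motor.txt")

with open(path, "w") as log:
    log.write("---Start---\n")                 # ASCII stand-in marker
    for reading in [80, 75, 40, 20, 55]:       # made-up sensor trace
        branch = "B" if reading > 50 else "C"  # branch that just ran
        log.write(branch + "\n")
# leaving the with-block closes the file (the "Close" action)

print(open(path).read().split())
```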

Once the program has been compiled and executed by the robot, connect the robot to the computer to find the generated file. To do this, click on the \u201cOpen Memory Browser\u201d button in the lower right corner of the program\u2019s window to open the EV3 memory management window, where we will find the text file \u201cmotor.rtf\u201d.<\/p>\n

\"Button
Button to open the intelligent brick memory management<\/figcaption><\/figure>\n
\"Intelligent
Intelligent brick memory management window<\/figcaption><\/figure>\n

Now just upload the file from the EV3 brick by clicking on the \u201cUpload\u201d button, and save it in the computer to the desired location.<\/p>\n

\"Supposed
Supposed contents of the \u201cmotor.rtf\u201d file<\/figcaption><\/figure>\n

Open the \u201cmotor.rtf\u201d file (e.g. with Microsoft Word) to view its contents. Note that the generated text file has been saved with a \u201c.rtf\u201d extension. Before opening the file, it can be interesting to ask the students what they think its contents will be and have them write the file out by hand on the blackboard. Any students who think that each switch branch is executed only once while the robot goes from one color to the other will presumably write \u201cB, C, B, C, B, \u2026\u201d, as shown to the right.<\/p>\n

When opening the file we can observe that each letter is repeated many times, indicating that the program goes through the same Switch\u2019s branch many times before changing to the other branch inside the Loop. Thus, the program executes the Loop much faster than the robot movements, as shown in the images below.<\/p>\n

\"Real
Real contents of the \u201cmotor.rtf\u201d file<\/figcaption><\/figure>\n

The next challenge is to count how many degrees each motor turns before the Switch changes branch inside the Loop. The number of degrees turned by each motor will mainly depend on its power, as well as on the robot design. Initially we will assume a power of 30% to avoid the robot slipping. To accomplish this we will explicitly use the internal rotation sensor of each motor. An important question to ask the students before starting the exercise is where to add the internal rotation sensor blocks for each motor. Given that the program passes through the same Switch branch several times before changing branch inside the Loop, as we have just demonstrated, it is important to realize that we need to record the rotations of a motor just before changing the active motor in the switch.<\/p>\n

The following image shows the previous line follower program where internal rotation sensor blocks for each motor have been added. It is important to realize that when a motor stops we need to count its rotation degrees, we need to reset the sensor, and finally, we need to record these rotation degrees in the text file. To accomplish this process we need to connect the data hub plug coming out of each motor internal rotation sensor block, corresponding to the motor degrees of rotation, to the corresponding \u201cFile Access\u201d block data hub text input.<\/p>\n

\"Initial<\/a>
Initial line follower program with file access and internal rotation sensor blocks<\/figcaption><\/figure>\n

Note: As we can see in the previous image, the \u201cFile Access\u201d block is capable of automatically converting the numeric values of the rotation degrees into a text string and we do not need to use a conversion block, as we might need to in other programming languages.<\/p>\n

But have we obtained the desired result yet? If we analyze in some detail the result we want in the text file, we will see that we need to record not only the rotation degrees but also which motor turned those degrees. Otherwise, we will obtain a bunch of numbers without knowing which motor they refer to. For this, we need to add the letter of motor \u201cB\u201d or \u201cC\u201d to the rotation degrees in each case, which means adding a new block to concatenate these text strings. This block is called \u201cText\u201d and allows concatenating up to 3 text strings.<\/p>\n
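The Text block\u2019s concatenation amounts to joining two strings per entry, as this Python fragment sketches (the motor/degree pairs are invented):<\/p>\n

```python
# Prefix each rotation reading with its motor letter, so every log
# line identifies which motor turned (values are made up).
samples = [("B", 42), ("C", 38), ("B", 45)]
lines = [f"{motor}: {degrees}" for motor, degrees in samples]
print(lines)
```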

Tip<\/strong>: It is interesting to observe that every time we use the \u201cFile Access\u201d block, it writes a new line into the datalogging text file. Thus, if we do not concatenate the text strings and instead use the \u201cFile Access\u201d block twice in a row, once for each text, we will obtain the motor letter and the rotation degrees on different lines, which will make the datalog file difficult to read.<\/p>\n

\"Correct<\/a>
Correct line follower program with file access and internal rotation sensor blocks<\/figcaption><\/figure>\n

As mentioned before, when recording the data in the text file we can observe that, to distinguish each motor’s rotation degrees, the motor is shown as well as its rotation degrees. For this, we need to combine the rotation degrees with a text that shows the corresponding motor, by means of a \u201cText\u201d block, for example using the text string \u201cB: \u201d or \u201cC: \u201d for each case, as shown in the above image. Next, compile and download the program to the robot.<\/p>\n

Tip<\/strong>: Every time a datalog is generated with the same name, data is appended and the file grows, which can end up filling the EV3 brick\u2019s memory. Thus, before the robot executes a program that generates a datalog, it is important to delete the previous file from the memory, whenever the previous data is not important. If you want, this can be done programmatically by adding a \u201cFile Access\u201d block at the beginning of the program with the \u201cDelete\u201d action and the appropriate filename. If we do not delete it, we can still easily find the start of each test by locating the initial text string the program writes just before starting.<\/p>\n
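The delete-then-append pattern from the tip looks like this in a Python sketch (the path and marker are stand-ins):<\/p>\n

```python
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "motor.txt")

# Equivalent of a leading File Access "Delete" action: remove any
# previous log so repeated runs start from an empty file.
if os.path.exists(path):
    os.remove(path)

with open(path, "a") as log:   # append mode, as later writes would use
    log.write("---Start---\n")

print(open(path).read())       # only the fresh start marker remains
```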

Finally, upload the \u201cmotor.rtf\u201d file corresponding to the data log to observe and analyze its contents. Before opening the file, ask the students to specify with the maximum detail possible what its contents will be.<\/p>\n

\"\u201cmotor.
\u201cmotor. rtf\u201d file contents with each motor\u2019s rotation degrees<\/figcaption><\/figure>\n

As we expected, we can see that every file entry that captures motor rotations just before the Switch changes branch inside the Loop contains rotation degrees ranging from 30 to 50. But why are some entries \u201c0\u201d, or very small, such as \u201c1\u201d or \u201c2\u201d? The answer is very simple\u2026 When we count the rotations of one motor, the Switch writes the rotations of the opposite motor, which is of course stopped. If this motor slips a little it might actually turn one or two degrees, and it can even slip backwards, which gives negative degrees, e.g. \u201c-1\u201d!<\/p>\n

How can we improve our datalogging file? Let\u2019s eliminate all the data that does not provide meaningful information. To do this we need to use the internal rotation sensor comparison attribute, or threshold value, of each motor, compare it with the values we want to record, for example only those bigger than 3, and finally record only that data in each case.<\/p>\n

To record only the desired data, one less efficient solution, which students should be able to find by themselves with the knowledge acquired so far, is to use a Switch associated with the appropriate internal rotation sensor. It is important to let the students find this solution, or at least part of it.<\/p>\n

\"Line<\/a>
Line follower program with a strategy to avoid null rotations, with file access blocks and a switch associated to the internal rotation sensor<\/figcaption><\/figure>\n

A more efficient strategy to record only the significant data is to use the internal rotation sensor \u201cYes\/No\u201d logic plug out, specifying the desired rotation degrees threshold, for example bigger than 3, and connect this data plug out to a logic switch<\/em> data plug in, so that only values above the specified threshold are recorded in the file.<\/p>\n

Finally, to connect the text of the \u201cFile Access\u201d block with the motor rotation degrees we need to \u201ccross\u201d the switch, something that can only be accomplished by deselecting its \u201cFlat View\u201d.<\/p>\n

Tip<\/strong>: Switches can be visualized either with a split sequence beam, in the so-called \u201cFlat View\u201d, or in the shape of \u201cTabs\u201d. To wire from a block located outside the Switch to a block inside it, \u201cFlat View\u201d needs to be unchecked to enable the \u201cTabbed View\u201d.<\/p>\n

\"Line<\/a>
Line follower program with file access blocks, internal rotation sensor blocks and efficient strategy to avoid non-significant rotations<\/figcaption><\/figure>\n
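The effect of the threshold gate can be sketched by filtering invented log entries in Python, dropping the slips of the stopped motor:<\/p>\n

```python
# Keep only rotation values above the 3-degree threshold; entries of
# 0, 1, 2 or even -1 degrees come from the stopped motor slipping.
raw = [("B", 42), ("C", 0), ("B", 45), ("C", -1), ("C", 38), ("B", 2)]
logged = [f"{motor}: {degrees}" for motor, degrees in raw if degrees > 3]
print(logged)
```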

But why is this programming strategy more efficient than the previous one? This program is more efficient than the previous one because it writes to the file at the end of the Switch branch after activating the opposite motor, and because it does not duplicate blocks unnecessarily.<\/p>\n

Once the program is completed, compiled, and executed by the robot, the file \u201cmotor.rtf\u201d, corresponding to the data logging record, only shows the rotation degrees accomplished by each motor while following the line.<\/p>\n

\"\u201cmotor.rtf\u201d
\u201cmotor.rtf\u201d file contents only with relevant rotation degrees of each motor<\/figcaption><\/figure>\n

Does anything in this file call our attention? Why is there a point where the rotations are bigger than in the general case? Can we identify this point on the field? This example was run on the challenge exercise field, with the robot following the line on its inner side for long enough to cross the first intersection line. The file numerically shows the point where the robot crosses this intersection: it corresponds to the biggest rotations, B=75 and C=172, needed to recover the line-following path.<\/p>\n
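Numerically, spotting the intersection amounts to finding the largest per-branch rotation in the log, as this sketch shows (entries are invented around the B=75 and C=172 values from the text):<\/p>\n

```python
# Each entry is (motor, degrees for one Switch branch); the biggest
# value marks the point where the robot crossed the intersection.
entries = [("B", 42), ("C", 38), ("B", 75), ("C", 172), ("B", 45)]
peak = max(entries, key=lambda e: e[1])
print(peak)
```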

Thus, by tracking each motor\u2019s rotations, we have just designed a strategy that makes a two-step simple line follower with one Color sensor more intelligent: it is now capable of knowing when it encounters a line intersection!<\/p>\n

Exercise 2. Data logging: A graphical debugging strategy<\/h2>\n

This exercise uses the Data Logging block to record the sensor readings of Color and internal rotations associated with motors \u201cB\u201d and \u201cC\u201d in a file with an \u201c.rdf\u201d extension.<\/p>\n

To start coding, create a new program, add the basic line follower algorithm including data logging blocks, as shown below.<\/p>\n

\"Line<\/a>
Line follower with \u201cData Logging\u201d blocks to start and stop data log recording<\/figcaption><\/figure>\n

When configuring the Start Datalogging block it is important to pay attention to different aspects.<\/p>\n

You need to give it a name, in our example \u201cmotor\u201d, which will be used later to stop the datalogging with a corresponding block. You need to set the duration to \u201cOn\u201d to pass time control to the Loop that will repeat for 9 seconds (this lets the robot start following the line immediately after the datalogging begins), and set the rate to 10 samples per second.<\/p>\n

Finally, we have to configure the data capture for the desired sensors. First add the Color sensor and then the internal rotation sensors associated with each motor. We can add sensors by clicking on the \u201c+\u201d sign icon at the upper right of the block. Note that the internal rotation sensor units are degrees and not rotations; later on we will see the difference.<\/p>\n
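The rate and duration settings determine how many rows the experiment records, as this numeric sketch shows (the sensor values are placeholders):<\/p>\n

```python
# 10 samples per second for 9 seconds -> 90 rows of
# (time, color, degrees_B, degrees_C); sensor values are placeholders.
RATE = 10       # samples per second
DURATION = 9    # seconds

rows = [(i / RATE, 0, 0, 0) for i in range(RATE * DURATION)]
print(len(rows), rows[0][0], rows[-1][0])
```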

Connect the EV3 brick to the computer to download the program, then disconnect the robot and make it execute the program. Place the robot on the same field as the previous challenge exercise to be able to compare the results.<\/p>\n

Once the robot has completed the line-following exercise, connect it to the computer and, from the Experiment interface, open the generated \u201cmotor.rdf\u201d file by clicking on the up arrow icon in the lower right corner of the application to do a file \u201cUpload\u201d.<\/p>\n

We will see that the window allows loading the \u201c.rdf\u201d file from the computer or directly from the EV3 brick, by selecting it from the list. In the lower part, the window shows the available files to select. In our case, select the file \u201cmotor.rdf\u201d.<\/p>\n

Tip<\/strong>: The \u201c.rdf\u201d files can not only be opened and deleted from the experiments graphical interface; you can also manage them from the EV3 window, as we did in the previous exercise with the text file. The \u201c.rdf\u201d files are binary files that can only be opened from the EV3 Software. These files have a specific structure that allows them to be read \u201cintelligently\u201d.<\/p>\n

\"\u201c.rdf\u201d<\/a>
\u201c.rdf\u201d file with rotation sensors units expressed in degrees<\/figcaption><\/figure>\n

This first datalogging file shows the data recorded from the Color sensor in red and the rotation sensors in purple. To make the graph easier to read, we can change the color of one of the rotation sensors by clicking the corresponding color square.<\/p>\n

To better differentiate the motors\u2019 movements and obtain more relevant differences, we have chosen degrees instead of rotations. Why? Because 1 rotation is equivalent to 360 degrees, that is, 360 units, so the differences are bigger. As you can see, the system chooses this option automatically.<\/p>\n

We can select the units, rotations or degrees, on the left side of each curve in the \u201cDataset Table\u201d tab. We can also choose the default units in the \u201cData Logging\u201d block attribute definition, as explained previously. It can be interesting to let the students test different units to better adjust the curve visualization, and note that you can even modify the minimum and maximum values of the coordinate system manually by typing in the desired value.<\/p>\n

It is important to make the students interpret and explain the generated graph by relating the different curves. First we can observe the Color sensor readings in red, which range from black to white (see the reflected light intensity on the Y-axis). A first question is: why are the lines inclined in a \u201csawtooth\u201d shape and not vertical? The answer is that near the threshold between black and white, the color sensor detects progressive changes in the reflected light intensity as the robot moves forward from one color to the other. See Exploring Thresholds<\/a> for more about this topic.<\/p>\n

Regarding the motor curves, we can ask the students why they have an \u201cS\u201d shape that tends to go up. The answer is that, on the one hand, when one motor starts the other stops, and on the other hand, the rotation sensors accumulate rotations, so the line has a positive trend.<\/p>\n

Finally, if we observe the color sensor curve, we can see that at a certain point there is a higher peak. What does it correspond to? This peak corresponds to an intersection on the line the robot follows, which in our case is the middle line of the challenge exercise field. If we pay attention, even if it is a little hard to see, the motors\u2019 rotation degrees, especially those of \u201cB\u201d in orange, also look bigger directly above this same point, at approximately second 4.5.<\/p>\n

To finish the exercise, what do we need to do to obtain only the rotations the robot makes between color changes in the loop iterations? To obtain only the rotations each motor makes in each Switch phase inside the Loop, we only need to reset the internal rotation sensors after stopping the motors inside each branch of the switch, using the corresponding internal Rotation Sensor block.<\/p>\n
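The difference between resetting and not resetting the rotation sensors can be sketched with a running sum (the per-branch degree values are invented):<\/p>\n

```python
from itertools import accumulate

# With a reset after each branch, the log holds per-branch degrees;
# without it, the sensor accumulates and the curve only trends upward.
per_branch = [42, 38, 45, 40]              # degrees with reset
cumulative = list(accumulate(per_branch))  # degrees without reset
print(cumulative)
```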

\"Line<\/a>
Line follower with data logging blocks and reset of each motor internal rotation sensors<\/figcaption><\/figure>\n

Before executing this exercise it can be interesting to ask the students what the differences between this new graph and the previous one will be. As we can see below, the graph corresponding to the new program makes the results much easier to interpret.<\/p>\n

\"\u201c.rdf\u201d<\/a>
\u201c.rdf\u201d file resetting the motors internal rotation sensors and with units expressed in degrees<\/figcaption><\/figure>\n

Given that the motor turn is reset between each line follower Switch branch of the Loop, the previous graph shows only the degrees turned by the motors inside each branch, which is exactly the information we are interested in to find out the point of the field where a line intersection lies.<\/p>\n

At this point it is interesting to ask the students to interpret the results. In the graph we can see that the motor rotation degrees also show a \u201csawtooth\u201d curve and that the curves of the two motors alternate, given that when one motor runs the other stops (or almost, because it can slip), corresponding to the cycles between the white background of the field and the black of the line being followed.<\/p>\n

Finally, we can observe that the peaks of the internal rotation sensors do not correspond exactly to the peaks of the color sensor. Why? This is due to the robot\u2019s inertia and to the execution speed of the program and sensor readings: each motor is reset exactly when the color crosses the specified threshold between white and black. Thus, the peaks of each rotation sensor tend to coincide with a point just before the \u201crise\u201d or \u201cdescent\u201d of the color sensor curve, that is, with the threshold between white and black. If we took many more samples per second this would be more obvious and, in fact, if the program were very fast, the internal rotation sensor curves would show a completely vertical descent, because the motors stop at this point.<\/p>\n

Another fact we can observe in the example is that motor B seems to turn a few more degrees than motor C along the robot\u2019s whole path. Why does one motor seem to turn a little more than the other?<\/p>\n

Assuming the robot is following a straight line segment, the motors should turn equally, but even if the difference is very small, one motor can turn more than the other due to any of a number of factors. These include the robot geometry; the color sensor not being well centered on the robot; the robot\u2019s weight not being evenly distributed; the wheels not being well built and arranged at the same distance from the robot\u2019s axis; one wheel slipping more than the other on the field surface (the one that moves as well as the one that should be stopped; in fact we have seen that a supposedly stopped wheel can actually turn backwards\u2026); or the field surface not being clean enough or having an unnoticeable lean to one side. If the differences are big, you can revise the robot construction and repeat the experiment, and you will probably obtain a slightly different result with very similar rotations for both motors. This is a very important point to take into account, especially when going to a competition or when high precision of movement is required.<\/p>\n

Finally, the Experiment mode of the EV3 Software also provides a very interesting interface tool that allows querying the sensor values at a specific point, in our case around second 4.5, where the rotations of motor \u201cB\u201d are biggest, corresponding to an intersection on the line the robot follows. This tool can be accessed from the \u201cAnalysis\u201d menu icon with the \u201cPoint Analysis\u201d option.<\/p>\n

The line corresponding to the analyzed point in time can be dragged to the desired position, in our case near second 4.5. We can also name it, generate as many analysis points as we want, and save all of them with the experiment for future use. The Experiment interface also offers other tools, such as an area analysis tool, a magnifying glass to zoom into a specific area, and a prediction tool, all of them very easy to use and potentially very helpful.<\/p>\n

Challenge: Get to the end of the cage!<\/h2>\n

As we have seen previously, when following the line the robot makes small movements to the right and to the left. When it encounters an intersection, however, in order to continue following its branch of the line, one wheel makes more rotations than are needed to follow the straight line, while the other wheel remains stopped. To control the global robot movement, we can detect when this occurs, stop the line following, turn the robot to the appropriate side (the one towards the intersection) and continue following the line. For the challenge exercise, we only need to monitor one wheel. In our case, we only need to monitor the rotations of motor \u201cB\u201d on the right, since the robot follows the black line on the left, that is, on the inner side of the field; when it finds the intersection line that goes to the left, wheel \u201cC\u201d stops and wheel \u201cB\u201d makes more rotations than when following the straight line, in order to find the \u201cwhite\u201d color.<\/p>\n

As we have seen, to control a motor\u2019s degrees or number of rotations, the internal rotation sensor block provides a way to compare the rotations with a specific threshold, 60 degrees in our case. If the rotations are bigger than the activation or threshold value, then the \u201cYes\/No\u201d attribute or plug out of the data hub will return the value \u201cTrue\u201d, which we can connect to the loop\u2019s logic plug in, using a data wire, to stop the line follower by exiting the loop.<\/p>\n
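The exit condition can be sketched in Python (the 60-degree limit comes from the text; the per-cycle rotation values are invented):<\/p>\n

```python
LIMIT = 60  # degrees; above this, the branch signals an intersection

def follow_until_intersection(b_rotations):
    """Return the cycle index at which motor B exceeds the threshold,
    i.e. when the logic output 'Yes' would make the loop exit."""
    for cycle, degrees in enumerate(b_rotations):
        if degrees > LIMIT:
            return cycle
    return None

print(follow_until_intersection([42, 45, 40, 75, 44]))
```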

Tip<\/strong>: Remember that to connect a block logical attribute, which is inside a Switch, to the logical connector of a Loop that contains the Switch, “Flat View” needs to be unselected. Alternatively you can use a \u201cLoop Interrupt\u201d block to force the exit of an unlimited or forever Loop.<\/p>\n

Note that the rotation sensor needs to be reset after reading the supervised wheel rotations, for each line following cycle. To count the rotations, the rotation sensor needs to be placed just after stopping the supervised motor and before starting the opposite motor rotations.<\/p>\n

\"Line<\/a>
Line follower that stops when detecting a rotations threshold bigger than 60 degrees in the internal rotation sensor associated to motor \u201cB\u201d<\/figcaption><\/figure>\n
\"Line<\/a>
Line follower alternative that stops when detecting a rotations threshold bigger than 60 degrees in the internal rotation sensor associated to motor \u201cB\u201d, using a loop interruption<\/figcaption><\/figure>\n

At a global level, to do the challenge exercise we need to run a line follower between each direction change or line intersection. To get past the points of direction change, we either execute some rotations with the appropriate wheel to turn the robot and start a new line-following stage (making a point turn with one wheel stopped), or drive forward to get past the intersection line (in this case, since the robot will already have turned, we only need to make it turn to the opposite side afterwards to recover its heading).<\/p>\n

Observe as well that, depending on the robot\u2019s turning direction, sometimes we need to monitor the right wheel and sometimes the left one, as shown below. On this last point, we can ask the students what needs to be changed in the program to make the robot follow the line from the other side.<\/p>\n
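The choice of wheel to monitor can be summarized in a short sketch (plain Python, an assumption for illustration rather than the EV3-G program; the motor-to-side mapping follows the setup described in this unit, where \u201cB\u201d is the right wheel and \u201cC\u201d the left one):

```python
# Which motor's rotation sensor detects the intersection depends on which
# side of the line the robot follows: on the left side the right motor "B"
# does the extra rotations while "C" stops; mirrored on the other side.

def wheel_to_monitor(line_side):
    """Return the motor to supervise when the robot follows the line on
    the given side ('left' or 'right')."""
    if line_side == "left":
        return "B"   # right wheel sweeps while the left wheel stops
    elif line_side == "right":
        return "C"   # mirrored situation: supervise the left wheel
    raise ValueError("line_side must be 'left' or 'right'")

print(wheel_to_monitor("left"))   # B
print(wheel_to_monitor("right"))  # C
```

Asking students to swap this mapping is exactly the exercise of making the robot follow the line from the other side.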

\"Challenge<\/a>
Challenge exercise program using a simple line follower with rotation control to detect each intersection line<\/figcaption><\/figure>\n

What are the advantages of using a rotation sensor for the challenge exercise? The big advantage of this solution is that it keeps working even if we modify the size of the field or the robot\u2019s starting point. It is therefore very generic and robust, although a little more complex than other, more straightforward solutions. What disadvantage can we observe in the previous program? The program duplicates the line follower code after each intersection. Further on we will see how to encapsulate this code in a user block, or subprogram, to make it much more compact and efficient.<\/p>\n
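The encapsulation idea can be sketched in plain Python (a hypothetical stand-in for an EV3 user block, with the degree readings again simulating the rotation sensor): the repeated follow-then-turn code becomes one function, called once per segment instead of being copied after each intersection.

```python
# One encapsulated line-following stage, the textual analogue of the user
# block the text proposes: consume per-cycle motor-B degree readings until
# one exceeds the threshold, and report how many cycles the stage lasted.

def follow_segment(readings, threshold=60):
    for count, degrees in enumerate(readings, start=1):
        if degrees > threshold:
            return count        # intersection detected: stage ends
    return len(readings)        # readings exhausted without detection

# The main program shrinks to a loop over the course's segments instead of
# duplicated follower code:
segments = [[30, 25, 90], [20, 70], [35, 30, 28, 80]]
cycle_counts = [follow_segment(seg) for seg in segments]
print(cycle_counts)  # [3, 2, 4]
```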

Acquired knowledge<\/h2>\n

Datalogging, either by writing to a file directly using the File Access block or through the Data Logging block, not only allows us to record data dynamically while the program executes, but also allows us to observe in detail, study, and understand how a program works, as well as providing a very good method of program debugging. In addition, students learn how to use Logic Switches, how to pass information between program blocks using data wires, how to access and manage the EV3 brick’s memory, and how to interpret numeric data in tables and graphs.<\/p>\n","protected":false},"excerpt":{"rendered":"

In this unit, we will use the datalogging tools available in the EV3 Software to study the internal operation of a two-step simple line follower with one Light sensor. It is a very simple but powerful example that can help students to understand and to correctly program other situations. One of the most important functions […]<\/p>\n","protected":false},"author":412,"featured_media":9768,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[38,99],"tags":[143,74,73,87],"_links":{"self":[{"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/posts\/9762"}],"collection":[{"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/users\/412"}],"replies":[{"embeddable":true,"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/comments?post=9762"}],"version-history":[{"count":1,"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/posts\/9762\/revisions"}],"predecessor-version":[{"id":14314,"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/posts\/9762\/revisions\/14314"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/media\/9768"}],"wp:attachment":[{"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/media?parent=9762"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/categories?post=9762"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/legoeng.local\/wp-json\/wp\/v2\/tags?post=9762"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}