Robots are great until they carry out tasks that take humans out of the equation. Or when they attempt to use ‘data’ as a substitute for insight.
For this reason I am not exactly excited about self-driving cars – and I pass some of them each week in the Gilbert area. (Bummer! Uber’s autonomous vehicle was involved in a three-vehicle crash last week.) Besides the safety aspect, there’s the real long-term effect: erosion of jobs – those jobs that involve routine manual tasks. Think of warehouse work, or the on-demand ‘runners’ and movers that make a factory work.
As fascinating as the demo below seems, it hints at the dark side of what robots could do to the workplace.
If there’s any upside to this, it’s that the companies defining this future are hiring people with emerging engineering and science (STEM) backgrounds. The company that developed this cart says it is hiring a ‘Computer Vision Scientist’ – someone with math skills, and experience in LIDAR, radar, sonar, GPS, etc.
I love it! The smart cart can ‘see’ and find its way through a messy warehouse. But it takes a scientist with ‘computer vision’ in his or her title to bring such technologies to fruition. At least it’s a raison d’être for STEM education: people who can carry out the cognitive, problem-solving tasks that bots cannot. Yet.
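To make the ‘seeing’ a little less magical: at its simplest, a cart like this turns range readings (from LIDAR or sonar) into a steering decision. Here is a toy sketch of that idea – hypothetical, not the actual product’s algorithm, which isn’t public:

```python
# Toy sketch: pick a heading from LIDAR-style range readings.
# (Hypothetical illustration only; real warehouse robots fuse many sensors.)

def steer_from_lidar(readings, safe_distance=1.0):
    """Given (angle_deg, distance_m) readings, return the angle of the
    clearest direction, or None if every direction is blocked."""
    clear = [(angle, dist) for angle, dist in readings if dist > safe_distance]
    if not clear:
        return None  # stop: no safe heading
    # Head toward the most open direction.
    best_angle, _ = max(clear, key=lambda r: r[1])
    return best_angle

# Example scan: an obstacle dead ahead (0 deg), open space to the left.
scan = [(-45, 3.2), (0, 0.4), (45, 1.8)]
print(steer_from_lidar(scan))  # -45: the most open direction
```

The real work – the part that needs those ‘Computer Vision Scientists’ – is in making the readings trustworthy in a messy, changing warehouse, not in the steering rule itself.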
So as #Flooding and #StormSurge are on everyone’s mind after the havoc of hurricanes Harvey and Irma, it is unhappily timely that robotics teams in schools and clubs across the country are wrapping their minds around an H2O challenge. Specifically, ‘Hydro Dynamics.’
It’s this year’s theme for the FIRST Lego League, which will culminate in tournaments between November and December. (Interestingly, the theme of the FIRST Global ‘Olympics’ in July was H2O Flow.) Alongside building and programming a bot to run missions, students must work on a research project – how water is sourced, conserved, distributed, etc. They must also come up with a solution that ‘adds value to society’.
Right now there are myriad issues that experts and government officials are wrestling with. Could students hypothetically solve some of these in the future? Dean Kamen’s FIRST outfit has been doing an amazing job of using robotics to build a new cadre of engineers, designers, and problem-solvers.
As I watch my school’s team assemble the missions in my lab, it’s evident that each mission (built of Lego pieces) is more complex this year: there’s a ‘Pump Addition’ mission, a Water Treatment model involving ‘Big Water’, and others involving Pipe Replacement and Sludge Removal.
Here is what the field mat looks like.
Though I will never wear an activity tracker, I’ve been very curious about the smart wristband / ‘wearables’ business – especially the territory Fitbit has been moving into.
Sure, most people will be awed by Fitbit’s ‘SpO2’ sensor, for instance. But beyond that clinical USP (keeping tabs on one’s blood-oxygen levels), there are features that blur the line between an activity tracker and a smart watch: a wearable that can make contactless payments via NFC, minus a phone.
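For the curious: an SpO2 sensor shines red and infrared light through the skin and compares how much of each the blood absorbs. The classic textbook estimate is the ‘ratio of ratios’ – a rough sketch below, with stock calibration constants that are illustrative approximations, certainly not Fitbit’s actual values:

```python
# Rough sketch of the textbook "ratio of ratios" pulse-oximetry estimate.
# The constants 110 and 25 are common linear-calibration approximations,
# not any vendor's real calibration.

def estimate_spo2(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate blood-oxygen saturation (%) from red/infrared light signals."""
    # Normalize each wavelength's pulsatile (AC) part by its baseline (DC).
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    # Common linear approximation: SpO2 ~ 110 - 25 * R.
    spo2 = 110 - 25 * r
    return max(0.0, min(100.0, spo2))

print(round(estimate_spo2(0.02, 1.0, 0.04, 1.0), 1))  # R = 0.5 -> 97.5
```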
There’s also the music feature. A smart watch that can store music could be a game changer. With Bluetooth and WiFi (and GPS), who knows what territory this ‘wearable’ might move into? Will it motivate some to leave their phones behind? Would that mess with the iPhone ecosystem?
The other reason I’m curious about this wearable: I plan to use Fitbit as an example in an upcoming class. It’s a class about the Internet and the connectivity it provides – and the hardware and software that run on the infrastructure students take so much for granted. Following up on last week’s look at Virtual Reality, there’s nothing like bringing up the much-hyped Internet of Things.
For those of us who cannot watch the eclipse today in North America, there’s a fascinating project that will document it. Google has worked with UC Berkeley on the Eclipse Megamovie, recruiting 1,000 volunteer photographers and amateur astronomers for the event. Volunteers must download the Berkeley-created app.
The eclipse will last from 9:05 a.m. Pacific to 4:09 p.m. Eastern.
So, for instance, in Scottsdale, Arizona (clearly outside the ‘path of totality’), the moon’s shadow will cover only part of the sun.
I found it interesting to read that damage to the retina occurs only if someone looks directly at the sun before or after totality without protective glasses. Thankfully, those who cannot watch the event live will have the citizen-sourced megamovie.
I liked the original Google Classroom for how it simplified letting learners belong to a ‘class’ even when they are not in the same building. Or country.
But the latest improvements to Classroom take it further, letting anyone who plans to teach create a lesson and connect with students. I just created a class as an experiment: a class on Writing and Publishing – the basis for a project this summer.
There’s lots of potential in how they hand over the tools to engage students and receive feedback. It’s evident Google is staking its claim on a sector ripe for disruption, especially since Khan Academy has prepared the ground for it.
As the New York Times put it, Google has practically out-maneuvered Apple in the education market. More than half the nation’s primary- and secondary-school students now use Google education apps, it says.
Washington DC’s humidity hovered around 90 percent when the competition began on 16 July. Team Sri Lanka’s four students were sweating bullets for different reasons: in a crowded basement, parked between Senegal and Sudan, their 20-wheel steel robot needed some repair work.
The bot they built in secret in a classroom in Colombo (they called it ‘Area 52’) had arrived with a warped axle and damaged omni-wheels. Two hours before departure, the airline had forced them to repack the 23-kilo, microwave-sized contraption into two boxes. The next day the motor failed – not an uncommon problem among teams here.
But they did take on the world! In this competition, which FIRST Global designed along the lines of the Olympics, each team worked in ‘alliances’ – groups of three country teams. It was fascinating to watch each team battle cultural and language barriers (not to mention jet lag and lost sleep), work through the constraints, and perform. My family and I were so proud to be there supporting them.
They did quite well in the strategy and design of the bot. In the rankings they placed 138th out of 163 teams – beating the US, France, and Russia. When you consider they had just nine weeks to prepare (many teams had at least twelve), it was quite a feat.
Kudos to coach Dilum Rathnasinghe, who took on such an unthinkable task. The team comprised Ali Anver, Ishini Gammanpila, Vinidu Jayasekera, and Akash Gnanam.
Here are some images from the 163-team, 157-nation Robotics Olympics. Read previous post here.
Modeled on the Olympics, FIRST Global’s inaugural international robotics event began on July 16th.
This was the opening ceremony.