Future Inventors – My LMD article on Team Sri lanka

SRI LANKA AT THE ROBOTICS OLYMPICS

BY Angelo Fernando

 

A short walk from the White House, the steps leading up to a neoclassical building where Robin Williams once performed spill over with teenagers in bright yellow and blue T-shirts. Using screwdrivers and wire, they are feverishly fixing their robots. It’s only 15 minutes before Round 1 of the two-day competition held in July – a global event drawing 163 teams from 157 countries.

The humidity in Washington D.C. hovers around 90 percent and Team Sri Lanka’s four students are sweating bullets. Huddled in a basement, and parked between Senegal and Sudan, their 20-wheel steel robot needs some repair work.

Why? The bot they had built in a classroom (so secretive was the project, they called the room ‘Area 52’) arrived with a warped axle and damaged omni wheels. The motor failed too, which is not an uncommon problem among teams here. In a few minutes, they must have their 23-kilogramme robot working. It is the ‘Olympics,’ after all…

Link to full article here.

Published in the Sept issue of LMD Magazine.

Self-driving carts – The downside of robotics

Robots are great until they carry out tasks that take humans out of the equation. Or when they attempt to use ‘data’ as a substitute for insight.

For this reason I am not exactly excited about self-driving cars – and I pass some of these each week in the Gilbert area. (Bummer! Uber’s autonomous vehicle was involved in a three-vehicle crash last week.) Besides the safety aspect, there’s the real long-term effect: the erosion of jobs – the ones that involve routine manual tasks. Think of warehouse work, or the on-demand ‘runners’ and movers that make a factory run.

As fascinating as this demo below seems, it’s the dark side of what robots could do to the workplace.

If there’s any upside to this, it’s that the companies defining this future are hiring people with emerging engineering and science (STEM) backgrounds. The company that developed this cart says it is hiring a ‘Computer Vision Scientist’ – someone with math skills, and experience in LIDAR, radar, sonar, GPS etc.

I love it! The smart cart can ‘see’ and find its way through a messy warehouse. But it needs a scientist with ‘computer vision’ in his/her title to bring such technologies to fruition. At least it’s a raison d’être for STEM education. People who can carry out cognitive, problem-solving tasks that bots cannot. Yet.

Robotics teams immersed in complex (timely) water challenge

So as #Flooding and #StormSurge are on everyone’s minds with the havoc from hurricanes Harvey and Irma, it is unhappily timely that robotics teams in schools and clubs across the country are wrapping their minds around an H2O challenge. Specifically, ‘Hydro Dynamics.’

It’s this year’s theme for the FIRST Lego League, which will culminate in tournaments between November and December. (Interestingly, the theme of the FIRST Global ‘Olympics’ in July was H2O Flow.) Alongside the work of building and programming a bot to run missions, students must work on a research project: how water is sourced, conserved, distributed etc. They must also come up with a solution that ‘adds value to society’.

Right now there are a myriad of issues that experts and government officials are wrestling with. Could students hypothetically solve some of these in the future? Dean Kamen’s FIRST outfit has been doing an amazing job of using robotics to build a new cadre of engineers, designers, and problem-solvers.

As I watch my school team assemble the missions in my lab, it’s evident that each mission (built of Lego pieces) is more complex this year: There’s a ‘Pump addition’ mission, a Water Treatment model involving ‘Big water’, and others involving Pipe Replacement, and Sludge Removal.

Here is what the field mat looks like.

Chamath Palihapitiya could throw a wrench into AI heavyweights

It’s always good to keep an eye on what Chamath Palihapitiya is up to. He has been building a team of ex-Googlers, and is supposed to be after the next generation of computing. A $10 million startup, to be sure!

This could signal a lot of things, depending on which pair of lenses you put on. It probably has a lot to do with AI – Artificial Intelligence. For instance, he hired away eight of the ten people at Google working on a secret project involving a chip with AI. He has poked fun at Watson, the IBM cloud-based machine learning application.

Watson, as you might be aware, turned the tables on Jeopardy (while Go, the 2,500-year-old game, fell to Google’s AlphaGo), and its machine learning is now entrenched in many sectors, from genomics to industrial safety. Google’s own machine learning project, known by its bland name – the Tensor Processing Unit (TPU) – is underway.

Palihapitiya talks of ‘probabilistic’ software that is changing how we depend on devices – a great shift from ‘deterministic’ software based on “if-then” sequences. Watch how he explains how machine learning and AI are transforming, and will up-end, computing. I bet Watson took in every word of this.
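That contrast between ‘deterministic’ and ‘probabilistic’ software can be sketched with a toy example – this is entirely my own illustration, not anything from the talk. The first function is a classic if-then rule; the second pools noisy sensor readings and acts on the most likely state:

```python
def deterministic_brightness(lux: int) -> str:
    """Classic 'if-then' software: one reading, one fixed answer."""
    if lux < 50:
        return "screen-dim"
    return "screen-bright"


def probabilistic_brightness(samples: list[int]) -> str:
    """A probabilistic version: pool several noisy readings and act on
    the estimated chance that the room is dark, not a single threshold."""
    p_dark = sum(1 for s in samples if s < 50) / len(samples)
    return "screen-dim" if p_dark > 0.5 else "screen-bright"


print(deterministic_brightness(40))                 # one borderline reading decides
print(probabilistic_brightness([40, 60, 45, 30]))   # a majority of readings decides
```

Real machine learning systems take this much further, but the shift is the same: the program reasons about likelihoods instead of hard-coded branches.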

Could Fitbit smartwatch take over iPhone territory?

Though I will never wear an activity tracker, I’ve been very curious about the smart wristband / ‘wearables’ business – especially the territory Fitbit has been moving into.

Sure, most people will be awed by Fitbit’s ‘SpO2’ sensor, for instance. But beyond the clinical USP (keeping tabs on one’s oxygenated blood), there are some features that blur the lines between an activity tracker and a smart watch: a wearable that can make contactless payments via NFC, minus a phone.

There’s also the music feature. A smart watch that can store music could be a game changer. With Bluetooth and WiFi (and GPS), who knows what territory it might lead this ‘wearable’ into? Will it motivate some to leave their phone behind? Would that mess with the iPhone ecosystem?

The other reason I’m curious about this wearable is, I plan to use Fitbit as an example in an upcoming class. It’s a class about the Internet, and the connectivity it provides. And the hardware and software that run on the infrastructure students take so much for granted. Following up on last week’s look at Virtual Reality, nothing like bringing up the much-hyped Internet of Things.

Hashtag’s Birthday today – When lowly keyboard symbol became a celeb

Ten years ago, how did you refer to this # symbol? Most people I knew called it the ‘pound sign’ and we barely used it. Except for street addresses (it felt sleeker than the old-school abbreviation “No”) as in #33 Clifford Place.

Ten years ago, we had other things that grabbed our attention. FB had just bragged about 30 million users. Virginia Tech happened, the iPhone had been launched, and the global economy was teetering.

So what were you doing on Aug 23rd, 2007? Where were you working? How did you communicate? I am curious to see how my friends were doing when the hashtag emerged.

Another eclipse crowd-sourcing project captures sounds, not pictures

My wife and I were discussing the eclipse – her two-and-a-half-year-old Montessori students had been excited about it! – and wondered if animal behavior was being tracked.

So it was a pleasant surprise when I saw Ruben Gameros’ post today on FB about his participation in the Purdue University ‘sonic effects’ project. Their “huge, continental scale” project was to study how animals would respond to the solar eclipse. Using acoustic sensors (from Alaska to Puerto Rico), they invited citizen scientists to collect data – the acoustic behavior of birds, crickets, cicadas, and frogs.

Why record sounds? Purdue researchers are looking at whether animals that are typically active during the day stop making sounds during an eclipse. Among other questions:

  • Are there patterns for birds, insects, mammals, amphibians, and even fish?
  • Are the changes in the circadian cycles different in coniferous forests, temperate deciduous forests, grasslands, and coastal ecosystems?
  • Is there a difference in behavior in the total eclipse zone compared to areas that are in the 90%, 80%, 70% and 60% or less zones?
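To make the core question concrete, here is a toy sketch – my own illustration, not Purdue’s actual pipeline, with made-up amplitude numbers – of how recorded audio could show daytime callers going quiet during totality:

```python
def activity_index(amplitudes):
    """Mean absolute amplitude of an audio window - a crude proxy
    for how much calling (birds, cicadas, frogs) is going on."""
    return sum(abs(a) for a in amplitudes) / len(amplitudes)


# Hypothetical sensor windows, in arbitrary units
before_eclipse = [0.8, 0.9, 0.7, 1.0]    # full daytime chorus
during_totality = [0.1, 0.2, 0.1, 0.1]   # near-silence as the sky darkens

drop = 1 - activity_index(during_totality) / activity_index(before_eclipse)
print(f"Acoustic activity fell by {drop:.0%} during totality")
```

The real study compares patterns like this across species, ecosystems, and coverage zones – which is exactly what the bulleted questions above are asking.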

Caltech had a different crowd-sourced project calling for eclipse-related animal behavior. It called for young scientists to make three observations during the event, and record them via a special app. It was supported by a teacher webinar and a web-chat.

I wish students across the country would have been motivated to do more than look for the umbra and penumbra. Or dodge the event, entirely. (One school I know cancelled the viewing even after they ordered glasses.)

Google’s Eclipse megamovie worth watching

For those of us who cannot watch the eclipse today in North America, there’s a fascinating project that will document it. Google has worked with UC Berkeley (Eclipse Megamovie) and has recruited 1,000 volunteer photographers and amateur astronomers for the event. Volunteers must download the Berkeley-created app for this.

The eclipse will last from 9:05 am Pacific to 4:09 pm Eastern.

So for instance, in Scottsdale, Arizona (as is evident, we are outside the ‘path of totality’) the moon’s shadow will cover just 63.3% of the sun. It all begins at 9:14 am and will continue for 2 hours 46 minutes. Peak time of the eclipse will be 10:34 am.
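A quick bit of datetime arithmetic on the Scottsdale numbers quoted above (the end time is derived from the stated start and duration, not from the original post):

```python
from datetime import datetime, timedelta

# Partial eclipse in Scottsdale, AZ: begins 9:14 am, lasts 2 hours 46 minutes
start = datetime(2017, 8, 21, 9, 14)
duration = timedelta(hours=2, minutes=46)
end = start + duration

print(end.strftime("%I:%M %p"))  # when the moon's shadow clears locally
```

That puts the local end of the event right at noon, with the 10:34 am peak roughly halfway through.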

I found it interesting to read that damage to the retina would only occur if someone looks directly at the sun before or after totality without the protective glasses. Thankfully, those who cannot watch the event live have the citizen-sourced megamovie.

Two flavors of ‘Ice Cream’ to the Space Station!

The Dragon capsule delivered several technologies and experiments (6,400 pounds of them) to the International Space Station. But it also delivered ice cream to the astronauts on board. So what’s a few scoops, for those folks who travel at 17,000 miles an hour for several months!

Also, in a geeky twist, it delivered another flavor, so to speak: ‘ISS-CREAM’ – the acronym for ISS Cosmic Ray Energetics And Mass. It is an instrument – first flown on high-altitude balloons – that “measures the charges of cosmic rays ranging from hydrogen up through iron nuclei, over a broad energy range.” Clear as mud. But very cool, huh?

As for the docking, as I mentioned in a previous post about the robotic arm and the maneuver, it is pretty cool! Humans need robots – and some ice cream now and then.