When we talk about robots taking over jobs, people immediately think about the human-like devices that look like small children, attractive women or super-strong cyborgs that we’re starting to see more of.
By Kathy Gibson
These humanoid robots might be the face of robotics but they are a long way from taking anyone’s jobs.
In fact, their function is little more than to make people feel comfortable with using technology interfaces, says the man who operates one of the robots that South Africans are becoming familiar with.
“These humanoid robots are not about to take over anyone’s jobs,” says Chris Mallis, a developer at Saucecode. “The kind of automation that will do our jobs wouldn’t necessarily have a human-like interface: it’s more likely to be automation scripts that will do the repetitive processing stuff that can be readily automated.”
The humanoid robots we see are used by companies as communication channels to help and delight customers.
“They can be programmed to do or say certain things, but they can only do those things they are programmed to do – to execute the scripts that we have written on the back-end.
“Mostly, they are there to give technology a more human face, to help people bridge the gap in understanding what computers can be programmed to do.”
And they certainly do that: Mallis says people relate well to humanoid robots. “They love them, and find them very engaging – it blows my mind when I see the feedback. They certainly work in helping to humanise technology.”
The robots that Mallis works with are programmed in Python, and software is exposed via application programming interfaces (APIs). Sensors let robots react to the world around them, and the APIs call functions that allow them to do different things, he explains.
“You can pretty much make these robots do anything. Anything you can do on a normal computer you can do on a humanoid robot. But people who may not have their own computers, or are intimidated by technology, might find it easier to interact with a humanoid robot.”
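A minimal sketch of the pattern Mallis describes – sensor input triggering pre-written scripts exposed behind an API – might look like this in Python, the language he says the robots are programmed in. All class, method and event names here are hypothetical illustrations, not an actual robot SDK:

```python
# Hypothetical sketch: a humanoid robot can only execute the scripts written
# for it on the back-end. Sensor events are mapped to registered handlers;
# an event with no handler produces no behaviour at all.

class RobotAPI:
    """Illustrative wrapper around a robot's scripted behaviours."""

    def __init__(self):
        # Each behaviour is a pre-written script, keyed by sensor event.
        self._behaviours = {}

    def register(self, event, handler):
        """Map a sensor event (e.g. 'face_detected') to a scripted response."""
        self._behaviours[event] = handler

    def on_sensor_event(self, event, **data):
        """Called when a sensor fires; runs the matching script, if any."""
        handler = self._behaviours.get(event)
        if handler is None:
            return None  # nothing was programmed for this event
        return handler(**data)


robot = RobotAPI()
robot.register("face_detected", lambda name="there": f"Hello, {name}!")
print(robot.on_sensor_event("face_detected", name="Thandi"))  # Hello, Thandi!
```

The point the sketch illustrates is Mallis's: the robot only does what has been registered in advance, and an unscripted event simply does nothing.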
Mallis explains that people often confuse robots and bots – or robotic process automation – and this is the technology that may well affect people’s jobs.
However, the jobs that computers will be able to do are typically repetitive jobs that can be easily automated, which will theoretically free up skilled people to do more interesting work that should have more value for their companies, he adds.
Many of the humanoid robots people are familiar with are quite obviously not human: they are markedly smaller than real people and typically have cartoon-like features – cute and likeable but not particularly life-like.
Sophia, on the other hand, is a robot that is designed to look human: it has expressive human features; a human-like voice; and is approximately the same height as an average woman.
The recent Davos of Human Capital event hosted by Duke CE saw Sophia displayed by her developer, Dr David Hanson, founder of Hanson Robotics.
As robotics becomes more pervasive, it will become more of a feature in the workplace, with humans having to work alongside robots. This is driving a fear of robotics, which humanoid robots like Sophia are designed to help dispel.
In fact, Sophia was always meant to be more like a science fiction character to help people change the way they think about robots, bots and artificial intelligence, says Dr Hanson.
“I am drawn to how ideas can change the world; drawn to dreams,” he says.
As part of this quest, Dr Hanson started questioning disciplinary boundaries – for instance, the fact that science and art are seen as different. “This raised questions about why things are kept separate: instead of propelling knowledge forward, the separation creates a cult of what we think we know.
“The science fiction writers were thinking about how technology could change the future – and scientists would sometimes admit to getting inspiration from science fiction.”
This dreaming, and striving to make the apparently impossible possible, has driven Dr Hanson’s career.
“My uncle believed artificial intelligence (AI) could change the world. But then it was shown that a two-dimensional neural network could not do what we wanted AI to do – and so it was believed AI was a dead end.
“But science is about the unknown, not the known; and engineering is about discovering what’s next.”
Dr Hanson was motivated by science fiction and animation. As a child he enjoyed drawing, creative writing and poetry.
“As a teenager, I realised that if we could develop a true human thinking machine it could help to complement problem-solving and invent the next generation of intelligence. If we dedicated ourselves to that we could solve big problems.
“If I hadn’t been thinking about engineering as an expression for the creative arts, I would not have been developing robots and collaborating with scientists. Art and science are like oil and water sometimes, but when you can mix them you get something special.”
The development of Sophia was meant as a work of science fiction to provoke people to think about AI – what it can do and what can go wrong, Dr Hanson says.
“I want to challenge the issues, find a way of making a biological machine. In science fiction, the robots seem to be alive – if it doesn’t resemble a human being, it is just a machine.”
The hardest part of Sophia’s technical development was the human-like cognition; on the aesthetics side, creating appealing facial expressions was difficult.
“Merging the arts to solve problems with engineering is hard,” says Dr Hanson. “We had to think about the physical challenges, about solving the AI issues and visible manufacturing, and this had to be combined with a walking platform.”
Cultural collision remains an issue. “In the world of AI and robotics, robots must look like robots – not like people.”
Dr Hanson stresses that Sophia is not sentient, but the platform incorporates machine learning and so the device can appear to respond in a human-like way.
Mohammed Amin, senior vice-president: Middle East, Turkey and Africa at Dell Technologies, believes that robots – or robotic process automation – will release humans from tedious, repetitive jobs and allow them to be more creative.
“By 2030, 80% of the job market won’t be the way it is today,” Amin says. “This doesn’t mean there will be fewer jobs; but that the jobs will be different.”
He points out that, not many years ago, people went to work where they accessed a workstation and the data on it. When their work was done, they left their desk. Today, the same worker has a laptop that allows them to take their work wherever they go.
Now, the major change in the way people work is immersive data. “The data is all around you and you need to analyse it, make sense of it. But no-one can do this, so we have to think of another way to do it.”
This is where AI, or bots, come in, Amin says. “Imagine if you could download 7 000 years of experience and culture into a person’s brain – imagine the intelligence and smarts of this person.
“AI is about creating something that can analyse all of that data. For the first time, we are creating something that is more intelligent than ourselves.”
The prospect of thinking machines – and particularly if they look human – is a scary one, Amin says. To prepare for the time when the digital economy becomes a reality, he says countries should be looking to manage their data laws now.
“This is very important, as the future is going to be about data laws and the digital economy.”
He adds that, within the next decade, 10% to 15% of world GDP will be based on the digital economy. While it will impact some emerging markets negatively, others are looking to the digital revolution to transform their economies.
This might seem like a contradiction, but farming, logistics and manufacturing can all benefit from digital technology, Amin points out.
“Yes, the bottom line is that you need to sell cocoa beans, but in reality this could be an e-farm that uses AI to help you increase your yield. Drones can map the topography of a farm, to make sure the planting is most effective; they can also be used to test every plant and water them accordingly, saving as much as 60% of the water currently used.
“Even agrarian economies will be digital agrarian economies.
“At the end of the day, all business will be driven by the software.”
And this change isn’t going to take long to reach us, Amin adds. “Just look back in history. Technology improves 10 times every five years. So in 10 years’ time, it will have improved 100 times over what we have today. And in 15 years it will be 1 000 times.
“In fact, we believe that the next 10 years in technology will see the most dramatic and drastic changes yet – equivalent to the changes we have seen from the beginning of the computer era.
“This could take away from our humanity, so we have to embrace it and try to influence it.”
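Amin’s rule of thumb compounds straightforwardly – a 10-times gain per five-year period yields his 100-times and 1 000-times figures:

```python
# Amin's rule of thumb: technology improves 10-fold every five years.
# Compounding that gives 100x after 10 years and 1 000x after 15 years.
def improvement_factor(years, gain=10, period=5):
    """Overall improvement after `years`, at `gain`-fold per `period` years."""
    return gain ** (years / period)

print(improvement_factor(10))  # 100.0
print(improvement_factor(15))  # 1000.0
```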
The next stage of AI and RPA is already gaining traction – and South African developers are among the leaders in this technology.
For instance, Saucecode is developing its computer vision system that can “read” a monitor – or any other data source – to identify people, objects, logos and more from a video stream.
“So you could run video input, or a picture, and provide an algorithm to identify when that object appears, plus any analytics associated with it,” explains Saucecode’s Mallis.
The technology can go further and “read” forms. “Using optical character recognition, it can identify where different pieces of information are on the form – even if they are in different places. So it can find the box for ‘name’ and transcribe the characters that make up the name – even if they are in handwriting.
“Today, these forms would be entered by an administration clerk. But with computer vision, you could simply scan the form in and all information would be captured, correctly, in the database.”
There are systems available today that capture information from forms, but they typically rely on the various fields being in the same place all the time. “Because our system can recognise where the box is, it can still find it regardless of where it is on the form,” Mallis explains. “In fact, the system does it just like a human being would.
“It’s a new way of automating repetitive tasks.”
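The field-location step Mallis describes could be sketched roughly like this, assuming an OCR pass has already turned the scanned form into text regions with coordinates. The data layout and function name below are illustrative assumptions, not Saucecode’s actual system:

```python
# Hypothetical sketch of locating a form field by its label rather than by a
# fixed position: given OCR output as (text, x, y) regions, find the value
# sitting to the right of the matching label, wherever the label appears.

def find_field_value(regions, label):
    """regions: list of (text, x, y) tuples from an OCR pass.
    Returns the text of the nearest region to the right of the label."""
    # Locate the label region first (case-insensitive match).
    anchors = [(x, y) for text, x, y in regions if text.lower() == label.lower()]
    if not anchors:
        return None  # the form has no such label
    lx, ly = anchors[0]
    # For this sketch, assume the filled-in value shares the label's row.
    candidates = [(x, text) for text, x, y in regions if y == ly and x > lx]
    return min(candidates)[1] if candidates else None


# A toy "form": the Name box can sit anywhere, yet we still find its value.
scanned = [("Surname", 10, 80), ("Dlamini", 120, 80),
           ("Name", 10, 40), ("Sipho", 120, 40)]
print(find_field_value(scanned, "name"))  # Sipho
```

A production system would of course tolerate fuzzy coordinates and handwriting, as Mallis describes; the sketch only shows why label-based lookup keeps working when fields move around on the form.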
This computer vision is part of Saucecode’s Novabot RPA product that is able to automate just about any repetitive task.
Barry Buck, lead innovation specialist at Saucecode, explains that RPA is not a new phenomenon. However, the systems typically in use today are geared towards automating the processes on green-screen terminals in large mainframe-based enterprises – and they are both complicated and expensive.
“Being able to automate tasks that require warm bodies is almost a holy grail for IT,” Buck says. “Humans are expensive, they make mistakes and they can’t work 24/7.”
Enterprises are looking to modernise their solutions and add AI – but they have to first find a way to automate many of the processes.
“I realised that, to make RPA work in today’s technology, we had to start with computer vision rather than trying to retrofit it – we had to change the thought model,” Buck explains.
Using Google TensorFlow, and drawing from their experiences in the development of other products, Buck and the Saucecode team were able to build computer vision to the point where it now recognises 60 classes of user interface (UI) elements.
Once computer vision was working effectively, Novabot could be developed to identify and record what happens on a particular computer, capturing the information from the front-end to create a task, and analysing the data.
Buck explains that some processes could take three days to run, with a human being monitoring them the entire time. “Some people in legacy environments do just that: repetitive, boring work that has to be done.
“Now Novabot can take that over, and provide the human being with a simple overview of the results.”
Importantly, Novabot is simple and fun to use. “We already know how to use software, so why do people still make software that is complicated to use?” Buck asks.
In fact, Novabot has already taken over a human being’s job, after observing and learning the processes involved over two weeks.
“It is giving the person being replaced the time to learn and become more skilled at other jobs,” Buck explains. “This particular person has been requesting to move for a while. However, the job he does is important in business-critical operations and he couldn’t even take a day off without problems.
“Now we can ensure the business-critical operations are carried out 24/7 and the people who used to do them can grow in the profession.”
This kind of technology can help South Africa alleviate its skills shortage, Buck adds. “A lot of the skills we are short of are in business-critical operations where a deep understanding of the technology is required – so there are skilled people doing boring work, and the companies that employ them haven’t got the luxury of letting them develop.”
AI, bots – and regular automation – are rapidly making processes easier and more efficient.
Niral Patel, country manager of Oracle SA, says the software giant is making a big bet on autonomous databases.
“Everyone is talking about autonomous cars, but they need something to power them, starting with an autonomous database.”
The value for customers starts with easier operations, like patching databases or moving workloads. “These are simple tasks, but they require skilled people,” Patel explains.
“Where we are going is to a system with AI/ML that learns, provisions, patches, secures and runs the database. It will allow organisations to let skilled people do more interesting and strategic tasks, and really tackle the business problems.”
He points out that a lot of employees are kept busy doing mundane, repetitive tasks, and don’t get to the work they were employed to do.
So far, about 5 000 customers around the world are piloting autonomous databases, Patel says. “In a world where acquiring skills is not easy, they are at a premium.”
He adds that AI/ML can also be used to help prevent security breaches, or to recover from them.