Meena - the latest Google ML Chatbot


There's been quite a bit in the press recently about Meena, Google's new chatbot powered by an "Evolved Transformer seq2seq" neural network model which is even more powerful than the recently touted GPT-2. The bot has been trained on "341 GB of text, filtered from public domain social media conversations".

Whilst the results for general conversation seem impressive, I can't help but think that this is a bit of a dead end, as the bot really has no idea what it is saying and is in no way directing the conversation; it's just coming up with the "best" response to each input, albeit taking the conversation to date into account. It just doesn't seem to have any intent or goals of its own.

More interestingly though, Google has been measuring the performance of the bot (and using that as part of its reward function) with just two variables: how sensible is a reply, and how specific is it. Lots of bots go for sensibleness at the expense of specificity; they come back with general statements ("why do you say that?") rather than actually trying to drive the conversation forward. For a while now we've been using Grice's maxims as a way of assessing bot performance (quantity, quality, relation, manner), which aim to cover similar ground, but the Google model may be even simpler and sharper - particularly when trying to train a human to do the assessing!
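Google's reported metric is simply the average of those two rates across a set of human-labelled responses (their paper calls it the Sensibleness and Specificity Average, or SSA). As a minimal sketch of the idea - the function and label format here are our own, not Google's actual evaluation pipeline:

```python
# Hedged sketch of an SSA-style score: each bot response gets two human
# judgements, "is it sensible?" and "is it specific?", and the score is
# the mean of the two per-response rates.

def ssa(labels):
    """labels: list of (sensible, specific) booleans, one per response.
    Returns the average of the sensibleness rate and specificity rate."""
    if not labels:
        raise ValueError("need at least one labelled response")
    sensibleness = sum(s for s, _ in labels) / len(labels)
    specificity = sum(p for _, p in labels) / len(labels)
    return (sensibleness + specificity) / 2

# Three responses: two sensible, only one of those also specific.
print(ssa([(True, True), (True, False), (False, False)]))  # 0.5
```

A bot that always answers "why do you say that?" would score well on sensibleness but near zero on specificity, which is exactly the failure mode this metric is designed to penalise.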

Samsung Neon


At the CES show Neon (backed by Samsung) showed off its Neon virtual humans. Press response (Engadget, Wired) was somewhat muted and sceptical; the main takeaways appear to be:

- Visuals are good, but really just a version of deep fake, and with uncanny valley issues
- Voice (TTS) is poor quality
- Chatbot engine is pretty basic
- Much of the viral video (e.g. the still above) features actors, not even the Neon avatars!

So, we'll see how far they've got by CES 2021 then!





Guest Post: Virtual Humans - 40 Years On - A Different View


In a previous Guest Post the issue of virtual humans and belief was discussed. Here is another guest post by an erstwhile colleague, Mark Childs, in response.



Outcome 4: There’s no significant breakthrough, no significant social or economic reform. Everything just carries on as it has, just getting a bit more rubbish each year.

The old man struggled to keep standing as the queue progressed slowly forwards towards the
counter. At the front an automated teller scanned irises, deposited a series of blocks of texturised
vegetable protein from a slot. He hadn’t been able to afford the premium subscription, so this was
the unflavoured variety.

Dropping the tray immediately in the recycler, he secreted the blocks in one of his pockets. He’d eat
them later once back in his room. The sound of the refectory was deafening, he couldn’t bear to be
there for more than a few moments. It was against the rules to eat in his cubicle, but that was his
only private space. His home.

The world struggled under 10 billion lives. 10 billion squeezed into the remaining habitable areas.
The United Kingdom had been lucky. So was the rest of Northern Europe, but everything below the
50th parallel was an overheated wasteland by now. The hydroponic farms and algae vats kept
everyone fed, but private housing was only for the wealthy. Parkland was given over to automated
farming machines.

AI kept everything running at an optimum level, but the infrastructure still creaked under the strain.
One of his children had been due to visit from the South-Eastern community, but after a day of train
cancellations he’d only travelled 30 km and had to turn back or risk being caught outside after
curfew.

Back in his cubicle he emptied out the TVP and began to munch through it. From his window he
could just about see the tips of hills beyond the grey ashcrete and ferrock landscape. Lying on his
narrow cot, if he propped himself up at the right angle he could look out at the thin line of
green. From under his bed he pulled out the tray of books. Proper paper books. His last mobile
device had died ten years earlier. There would be no more. All the trace elements used to
manufacture them had been depleted years before and anything electronic was now too expensive
for private ownership. Apart from the very rich. All computing power was spent on just maintaining
the system.

He had begged to be allowed to keep a small collection of books. He’d chosen the smallest ones he
could, so that he could bring as many as possible before the rest went to be recycled. That had been
a mistake. His eyes were now too weak to read them, and it would be months before it was his turn
to get the tightly rationed prescription glasses he needed. He caressed the pages, imagining the
words and then looked out again towards the hills. Tomorrow he would head towards them. And see
how far he got.

Outcome 5. An image of a very specific tech, for a very specific purpose.

“I’m afraid, Dr Page, that we’re going to have to upgrade your husband.” The medic looked at the
elderly woman seated on the other side of the desk, noting the concern on her face.
“If you can, you’ll achieve something I’ve not been able to do in 45 years of trying,” she replied.
“What did you have in mind?”
“It’s his vascular dementia. It’s getting to the point where the AImenuensis can’t compensate for the
effects.”

The AImenuensis was an augmented reality device that people wore at all stages in their life. It
recorded all of a person’s interactions and acted as a support for them; reminding them of meetings,
prodding them with the correct things to say, recalling the names of people when they met them,
and how they knew them. The simplest just kept up a constant stream of information through an
earbud. The more complicated appeared as an embodied agent, overlain on their field of vision
through intelligent contact lenses. You could always tell when someone was relying on an
AImenuensis. If you greeted an old acquaintance in the street there would be a moment of
blankness, they would look off to one side as if consulting an invisible person, then you would see
dawning realisation and they would nod in understanding, and then reply with “Julian, how good to
see you, it must be five years. How is Sandy and the legal business?” or some such.

For older people, the AImenuensis was much more proactive. It would prompt the user with “you’ve
told them that already” or “that’s not the appropriate thing to say these days” or “no, that was the
character in the last episode who shot the detective, the one who was investigating the off-planet
arms deal” or similar. As someone’s memory worsened the information got more basic. This is your
son. This is where you live now. That person is dead.

But for Dr Page the husband, even that was no longer any help.

“I’m afraid that he’s not able to retain information long enough to make sense of his AImenuensis
any more. The degeneration is too far gone. I’d recommend an implant. A yottaflop interface.”
Dr Page the wife nodded. A yottaflop interface. A device with a processing power of one septillion floating-point operations per second, implanted directly into the brain. This could link to sensory input for taste, touch, sight, hearing, heat, creating an illusion indistinguishable from reality.

“Why a YI?” Page asked.
“YI pet,” the medic explained. “You told me your husband was asking yesterday where his cats are.
Which have been gone at least 30 years?” Page nodded. “Even though I believe they may have a
therapeutic value I’m sure we couldn’t get a licence for one past the carbon control commission, let
alone two, but this would be an option.”

Waking from sleep, the old man touched his head where the surgical tape was irritating his scalp.
The room was unfamiliar, he thought he recognised the elderly woman asleep on the chair next to
him, but he couldn’t place her. The screen in front of him was odd. It had no glass to it, it just seemed
to hover in mid-air. He felt bewildered, intimidated. Adrift in a strange world. Then he felt a warm,
furry head nuzzle into his shoulder and he relaxed. He had everything he needed.

Outcome 6. We all get everything we wanted.

Carbon-fibre muscles twitched beneath my plastic-alloy skin. Light-catcher wings unfurled; I caught
an uplift and soared. Flicked out, downloaded to the hoverblimp to watch my body fall in third
person, then back to pick up the plummeting silverfish-angel figure and felt the gees as I pulled out
of the dive.

There was a rheumatic-carcass of a flesh-body rotting somewhere. I’d ditched it 15 years before on
my 70th birthday. A new one was being grown in a vat somewhere and would soon be ready. Back to
four limbs. Back to the ground. A mundane. I’d been lucky. Not every mind made the trip from meat,
to code, to metal. It was a fundamental plasticity that enabled it to happen. Too trapped in the
physical, you couldn’t do it. Too afraid of losing identity, you couldn’t do it.

But flow, and fly, and give yourself up to losing everything you were, and you could slip from form to form, reality to reality. And when the soulcatcher tech was perfected, there was a whole new realm
of expertise created, like a lottery handed out randomly to a handful.

The mycelium towers beckoned and I slipstreamed towards them. Towering strands of fungus
filaments. Flash-consciousness folded my mind to that of the AI running them. Slow and ponderous, it
spoke of wind and growth and temperature fluctuations. Bored I switched to the control of the
fusion station, then out to the weather control net, then bounced around the orbital platform AIs.
Mars ship inbound. Clavius shuttle. People. Bustle. The AIs whispered to each other, ignoring the
intruder but making processing space. I promised to let them borrow my flesh-body once it was
ready. In return they let me occupy the interstellar ship in the dock for a few milliseconds. It lay in
dry dock ready to launch, nose pointed towards the stars.

I stretched, tapping into internal sensors, feeling the hard vacuum, showering in a neutrino burst.
Exploiting multidimensional proprioception, inhabiting fractal body schema. Novae called to me and
intoxicated by the sensation I jumped again. Finding a cloud of nanites floating off the Pacific coast.
Then the city AI for SF, alongside several other entities similarly polymorphing. I felt the buzz of the mycelial towers still in my head. I'd picked up some stray code from there. The echoes of the
neutrinos still hyped me as SF beckoned in a riot of colours, vehicles swerved, lights flashed across
skyscrapers as I slam danced the cosmopolis. A school of factory whales, scooping up plastic detritus
and making new whales, swam past, and I split across them, feeling myself spread across multiple bodies,
then merged again. Something was happening. To my mind. Bits were left as I leapt again and again.
Left behind. An errant error on a starship, a metal whale regurgitated. Polyvinyl ambergris. Flying
silverfish angel. Circling fungal towers. Bouncing. Jum. Ping. A tramcar signal flicked. Broken shades on a sidewalk. Registered a presence. For a second. The code. My code. Loss. Static. Where was? I? I? I? 49 3F. 49 3F. 0100 1001 0011 1111. 0000 0000 ….0 …..

New AI Survey


In the book we have what we call the "AI Landscape" diagram, mapping a variety of systems against how "sophisticated" they are, and how "human" they seem.



We plotted a variety of real and imagined computer systems, robots and AI on the diagram, but that was just our assessment (and deliberately not shown above!).

Now we're giving YOU a chance to tell us where you think these systems should be on the chart, as well as how much they reflect your idea of what AI is, and what elements are important in a virtual human.

Just fill out the survey at: https://www.surveymonkey.co.uk/r/W6XM537

AND there's a chance to win a copy of the book if you leave us your email at the end of the survey.

The survey is open until 30th November, and we'll be publishing results here in Jan 2020.

We look forward to hearing your views.

The After Wife


After the recommendation from Dr Elaine Kasket (author of All the Ghosts in the Machine: Illusions of Immortality in the Digital Age) at our book launch I picked up Cass Hunter's The After Wife to read whilst on holiday. Whilst not my normal sort of book, it is a very Channel 4 Humans take on an android copy of the main character, who is there to support and comfort her family after she dies early, but expectedly. As with Humans, the android shows a lot of hesitant, halting mechanical action, and equally cumbersome sentences, whilst at the same time being able to make long speeches and cogent arguments. It is light on detail (understandably) when it comes to how the android's mind was made, other than lots of access to physical diaries/memorabilia/electronic data and watching the target character. There's also an interesting comparison between the android and the elderly mother suffering from dementia, who has no problem in seeing the android as just another nurse/carer.

A good read generally, though as ever there's the obsession with the physicality of virtual humans.

David at CogX


David was one of the Keynote speakers in the Lab-to-Life strand at CogX - the huge AI conference that took place in a very wet London this week. David spoke about Daden's work on Virtual Life Coaches and Virtual Personas, touching on Digital Immortality at the end. As the compere said, if people taking photos of the slides is a modern indicator of a good talk then this was a well received talk indeed. Other comments included "excellent", "really interesting" and "flipping fascinating"!

And this is how wet it was: