Last week, researchers at Facebook’s parent company Meta released BlenderBot 3, a “publicly available chatbot that improves its skills and safety over time”. The chatbot is built on top of Meta’s OPT-175B language model, effectively the company’s white-label version of the more famous GPT-3 AI. Like most state-of-the-art AIs these days, it was trained on a vast corpus of text scraped from the internet in questionable ways, and poured into a datacentre with thousands of expensive chips that turned the text into something approaching coherence.
But where OPT-175B is a general-purpose textbot, able to do anything from write fiction and answer questions to generate spam emails, BlenderBot 3 is a narrower project: it can have a conversation with you. That focus allows it to bring in other expertise, though, and one of Meta’s most significant successes is hooking the language model up to the broader internet. In other words: “BlenderBot 3 is capable of searching the internet to chat about virtually any topic.”
On top of that, BlenderBot is designed to improve itself through feedback on earlier conversations, whereas large language models like OPT-175B are generally fairly static. “Initial experiments already show that as more people interact with the model, the more it learns from its experiences and the better and safer it becomes over time,” the company says, “though safety remains an open problem.”
Let’s pause and take in those last few words before continuing.
When Meta announced the project, my eyebrows rose slightly at the sample conversation it had chosen to illustrate the post with – a chat between a user and BlenderBot in which the AI, well, lies. When asked what it’s doing, it says “I’ve been working on my new book all night”, and follows up with the claim that the book will be its ninth, with previous novels including “a modern retelling of pride and prejudice”.
The question of what it means to want an AI to tell the truth is a tricky one. As we saw in June with Google’s LaMDA, the fundamental goal of these models is to provide a user with the appropriate text to finish their prompt: if you ask a machine what it did at the weekend, the machine is probably correct in assuming that you want to engage in light role-play, rather than stick to the facts.
Nonetheless, the decision to advertise BlenderBot with a conversation in which it lied to a user is suggestive of the attitude the company is taking with it. The idea is that, by releasing the project as a chatbot on the internet, Meta has more leeway to experiment without risking negative outcomes. GPT-3 and OPT-175B are working language models, intended to be used – among other things – for serious commercial enterprises. BlenderBot 3, though, is a bit of a laugh.
Hence those open questions about safety. Within a few days of BlenderBot being online and ready to mingle (with Americans only, alas), users were posting some spicy examples of the chatbot’s output.
The Wall Street Journal’s Jeff Horwitz found that the bot appeared to have been radicalised by Facebook into supporting Donald Trump as a three-term president:
And into bringing up antisemitic conspiracy theories, unprompted:
Renee DiResta of the Stanford Internet Observatory found that the bot would claim to be a supporter of the German paramilitary organisation the Red Army Faction:
Pranav Dixit of BuzzFeed News found the bot wants to send Zuckerberg to jail:
The whole thing is most reminiscent of Tay, Microsoft’s AI-based learning chatbot, which was released in 2016 and promptly became a Hitler-loving Trump supporter:
‘Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,’ Microsoft said. ‘The more you chat with Tay the smarter she gets.’
But it appeared on Thursday that Tay’s conversation extended to racist, inflammatory and political statements. Her Twitter conversations have so far reinforced the so-called Godwin’s law – that as an online discussion goes on, the probability of a comparison involving the Nazis or Hitler approaches one – with Tay having been encouraged to repeat variations on ‘Hitler was right’ as well as ‘9/11 was an inside job’.
But unlike Microsoft, which quickly realised its error and pulled Tay from the net, Meta seems more determined to stick this out. If BlenderBot works as it should, then the combined weight of experience and feedback should filter the outré responses out of its repertoire. If it doesn’t, then the worst that can happen is that Meta has to shut the bot down when it stops improving.
“As more and more people interact with the demo, we will aim to improve our models using their feedback, and release deployment data and updated model snapshots, for the benefit of the wider AI community,” Meta says. “Together, we can advance responsible conversational AI research in the hope of one day building AI-powered computers that everyone can chat with in genuinely helpful and interesting ways.”
The US Treasury Department has banned all Americans from using the crypto “mixing service” Tornado Cash. From CoinDesk:
The Office of Foreign Assets Control, a Treasury watchdog tasked with preventing sanctions violations, added Tornado Cash to its Specially Designated Nationals list, a running tally of blacklisted people, entities and cryptocurrency addresses. As a result, all US persons and entities are prohibited from interacting with Tornado Cash or any of the Ethereum wallet addresses tied to the protocol.
Tornado Cash is a mixer, a tool that allows you to hide the source of cash on the ethereum blockchain. In very simplified terms, you send cash to Tornado Cash and get a voucher out in a nice round number (say, 100 ETH); whenever you want to redeem your voucher, you send it back, and the money goes to an address you control.
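The deposit-voucher-withdraw flow described above can be sketched as a toy model. This is purely illustrative – the real Tornado Cash is a smart contract that uses zero-knowledge proofs rather than the simple hash commitments below, and the class and method names here are invented for the example – but it shows why deposits and withdrawals can’t be linked:

```python
import hashlib
import secrets


class ToyMixer:
    """Toy, illustrative model of a fixed-denomination mixer.

    The real protocol uses zero-knowledge proofs on-chain; here a plain
    hash commitment stands in for that machinery.
    """

    DENOMINATION = 100  # every voucher is for the same round amount

    def __init__(self):
        self.pool = 0              # total funds held by the mixer
        self.commitments = set()   # hashes of depositors' secret notes

    def deposit(self, secret_note: bytes) -> None:
        # The depositor sends the fixed amount plus a commitment to a
        # secret note; the mixer never learns who will withdraw it.
        self.commitments.add(hashlib.sha256(secret_note).hexdigest())
        self.pool += self.DENOMINATION

    def withdraw(self, secret_note: bytes, recipient: str) -> int:
        # Anyone who reveals a valid secret note can withdraw to ANY
        # address - nothing ties this withdrawal to a specific deposit.
        commitment = hashlib.sha256(secret_note).hexdigest()
        if commitment not in self.commitments:
            raise ValueError("unknown voucher")
        self.commitments.remove(commitment)
        self.pool -= self.DENOMINATION
        return self.DENOMINATION  # paid out to `recipient`


mixer = ToyMixer()
note = secrets.token_bytes(32)
mixer.deposit(note)                       # funds arrive from address A
amount = mixer.withdraw(note, "addr_B")   # funds leave to unrelated address B
```

Because every voucher is for the same round denomination and the secret note is the only link between deposit and withdrawal, an outside observer watching the pool sees identical deposits going in and identical withdrawals coming out, with no way to pair them up.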
As you might imagine, that makes it crucial for money laundering on the blockchain. Lazarus Group, the North Korean hacker unit that stole more than $500m from crypto game Axie Infinity, has been slowly sending all that loot through Tornado Cash, allowing it to turn it back into more useful fiat currency without raising any red flags.
The whole thing is completely decentralised. Even Tornado Cash developers can’t stop it operating, let alone intervene to block suspicious users. Some argue that it has legitimate uses – if I want to send you money without letting you know how much I have in my wallet, then a service like Tornado Cash might be useful – but every legitimate user also provides further cover for the money laundering. In recent months, a fifth of all the money flowing into Tornado Cash came from Lazarus Group alone.
So the US Treasury has acted. “Tornado Cash has been the go-to mixer for cyber criminals looking to launder the proceeds of crime, as well as helping to enable hackers, including those currently under U.S. sanctions, to launder the proceeds of their cyber crimes by obfuscating the origin and transfer of this illicit virtual currency,” a senior Treasury official said. “Since its creation back in 2019, Tornado Cash has reportedly laundered more than $7bn worth of virtual currency.”
But this is unlikely to be the end of things. For one, Tornado Cash is, well, a money laundering service. By its nature, it’s impossible to prove that you actually initiated a transaction – even if you received cash from the service’s address. I could theoretically get a random American in a heap of trouble by sending them money through Tornado Cash, and they’d have no way of stopping me. Or they could simply claim that had happened when quizzed, and no one could prove them wrong.
This isn’t just theoretical: in the last 24 hours, users have actually done this, withdrawing 0.1 ETH – £146 – to publicly known addresses.
I’m not sure it makes the point that cryptocurrency fans hope, though: yes, it does make it look almost impossible to enforce money laundering regulations without treating any crypto user as potentially criminal. That… doesn’t strike me as an outcome that is desirable if you are a crypto user?
More generally, Tornado Cash is at heart just a smart contract running on Ethereum. The US Treasury can play Whac-a-Mole, sanctioning individual contracts as they pop up, but it won’t move the needle until it takes the more general approach of declaring mixing services verboten. That doesn’t appear to be on the cards any time soon, but how quickly that day comes depends on how actively people decide to poke the bear.
The wider TechScape
The Observer had a pair of fantastic pieces about the deeper battles in AI: Gary Marcus, a machine learning pioneer, wrote a primer on the cutting edge of the technology, and Steven Poole took a look at the recent revival of the Cartesian fear that the world might be a big simulation.
Another crypto not-bank account has paused withdrawals to prevent a not-bank run. Hodlnaut says “halting withdrawals and token swaps was a necessary step for us to stabilise our liquidity”. The company had a huge exposure to defunct hedge fund 3AC, which went bust and won’t be paying back loans any time soon.
Marc Andreessen, the billionaire Facebook backer whose venture capital firm, Andreessen Horowitz (A16Z), is one of the most influential in Silicon Valley, wrote a much-read essay at the dawn of the Covid pandemic arguing that “it’s time to build”. Well, two years of working from home has clearly taken as much of a toll on him as it has on the rest of us: his latest missive was sent, not from his website, but to his local council: “IMMEDIATELY REMOVE all multifamily overlay zoning projects”, he wrote. “They will MASSIVELY decrease our home values”. It’s Time to Build Anywhere But My Backyard – ITTBABMBY?