The original Boston Dynamics BigDog robots were creepy but unintimidating, scrabbling on terrain like students stumbling home after a night out. I first saw these videos during my engineering degree when learning about programming motion and self-correction. It was eerily heart-wrenching to see researchers kick these faceless robots and mesmerising to watch them regain their balance. Boston Dynamics even hammed up their robots’ innocence by making them dance. Fast-forward to Deep Tech Week 2024 and the latest Boston Dynamics robots look alarmingly similar to the inexorable killers of a Black Mirror episode.
It was once plausible to believe these robotic pups would only complete life-saving tasks like search-and-rescue missions and carrying battle casualties over difficult terrain. Spun out from academic research at MIT, Boston Dynamics has since passed through the hands of private investors like Google and SoftBank. It is now owned by a car company and serves private sector clients, but as the technology has advanced, it has become increasingly difficult to ignore its business potential as a cold-blooded assassin. This blurry line between selling to the military in general and providing weapons in particular matters for ESG investors, who typically screen out companies with negative social impact.
Boston Dynamics is aware of this perception, and pledged in 2022 that it would never weaponise its robots. Yet these kinds of promises may not be enough to prevent companies from pursuing a more sinister path in the long run. OpenAI, the maker of ChatGPT, is the latest well-publicised example of a company set up with corporate governance to restrict its activities from becoming too dangerous for humanity. Money-making interests have since released it from the shackles of its independent, not-for-profit board.
Boston Dynamics is not alone in developing life-saving and defence logistics technology that could conceivably be used for harm. Skydio, a drone company, was originally launched with a noble mission: reducing the danger of construction site inspections and helping first responders at incidents. It now has a defence product which helps troops with reconnaissance on the battlefield. There is no indication that Skydio intends to arm its drones, but given that autonomous drones are already used for targeted assassinations, it is not a stretch to imagine its drones becoming armed combat participants in the future.
As a society, we have accepted that the private sector is part of the military-industrial complex. Huge companies like QinetiQ, BAE Systems and Lockheed Martin already supply deadly technology to the military. It makes good business sense for start-ups to target defence customers. There is vast money to be made, and once awarded, contracts can be long, predictable and lucrative.
However, there is debate about whether ESG investors should fund defence firms. ESG funds historically avoided investing in defence companies because of their potential for harm. Businesses like Boston Dynamics and Skydio would probably still qualify as typical ESG investments because they save lives and stop short of producing weaponry.
Yet since the onset of the Ukraine War, ESG investors have come under fire for excluding defence. The defence industry has argued that supplying weapons to Ukraine contributes to ‘social stability’. Our own Prime Minister supports making defence companies eligible for ESG-related investment, because “there is nothing more ethical than defending our way of life.”
While I support the military, and am surrounded by a family of active and former service members, I stop short of agreeing with Sunak that we need to include defence in ESG frameworks. There is nothing stopping someone who is excited about the humanitarian benefits of defence firms from directing their investments towards these endeavours, and taxpayer money already goes towards funding national militaries.
Yet standards around ESG labelling exist to protect the average individual who wants their investment to “do good” but does not have the time or ability to do due diligence on the funds investing on their behalf. Just because defence companies protect one side’s ‘way of life’ does not mean that they should be included in funds marketing themselves as having a positive impact on the world. Machinery designed to kill another living being does not seem to fit that bill.