2 July 2024

New book explores emerging metaverse economy, AI creative revolution


We came out of lockdown only to find that we had “rush hour in the metaverse”. Brands and businesses were staking claims in this virtual realm without quite knowing why, or what the outcome would be, but instinctively understanding that they needed to.

The metaverse economy is now starting to take shape as the technologies required to play in this parallel universe become more accessible. Landmarks are being built, countries are creating cyber services for their citizens, universities have opened and retail brands have started to trade.

As the metaverse grows and evolves, so too does a new AI creative economy, one that is raising many eyebrows. An algorithm won an art contest and ignited controversy in the art world, while Ai-Da, the world’s first ultra-realistic humanoid robot artist, was invited onto a panel at the global culture summit in Abu Dhabi.

Machine writing is becoming more sophisticated: AI chatbots such as ChatGPT can code, compose music, write essays and movie scripts, and answer questions more efficiently than a Google search. They can also admit mistakes, challenge incorrect premises and reject inappropriate requests.

The question now becomes whether technology is our friend and helper, or our ruler.

Somebody’s watching you

“Shameware” refers to apps that use phone-monitoring technology to track digital behaviour. This information is then sent to a so-called accountability partner, such as a parent, or someone intent on helping you, like a teacher, mentor, pastor or therapist. 

Your phone use is then tracked: the app captures screenshots, detects the apps being used, records the websites you visit, and reports questionable URLs or online behaviour to your accountability partner. 

Many proponents believe there is a growing moral crisis and that these apps serve to dissuade certain kinds of behaviour, such as the watching of porn. Covenant Eyes is one such app. 

On its website it encourages you to “join over 1.5 million people who’ve used Covenant Eyes to experience victory over porn”, and Gracepoint, a Southern Baptist church in the US, has reportedly recommended this app to congregants. 

Accountable2You is another example. These apps generate strong and divergent opinions. Anti-porn advocates, for example, see them as innovative tools to further their beliefs, but for critics they are an invasion of privacy, with many “users” unlikely to understand the extent of the surveillance to which they are exposing themselves. 

For example, in a Wired magazine test of Accountable2You, the app flagged content containing the keywords “gay” or “lesbian”. Not surprisingly, the ethical implications of this controversial technology are profound. 

There is also scepticism over whether these apps have a positive effect on users. Nicole Prause, a scientist at the University of California who studies the effect of pornography on the brain, says, “I’ve never seen anyone who’s been on one of these apps feel better about themselves in the long term. These people just end up feeling like there’s something wrong with them.”

There’s clearly an appetite for this type of accountability surveillance technology and it remains to be seen whether it will proliferate.

Indeed, we have also seen corporations installing similar software to monitor not only employees’ online behaviour but even their attitudes and emotions at work, in order to improve productivity and keep tabs on morale. 

Personal surveillance technology, however, is rather obviously mired in controversy and businesses would be well advised to consider the implications of employing surveillance technology to spy on employees, even for well-intended reasons. In general, relationships based on trust are preferable to those based on suspicion.

Companies with high trust levels reportedly outperform companies with low trust levels by 186%. Trust between managers and employees promotes staff retention and productivity, and if there is a breakdown in trust, employees may become unreliable, disengaged, disloyal or uncommunicative. 

Policymakers should put in place laws that protect citizens from invasive and sometimes nefarious use of surveillance technology. In other words, control should not come at the expense of our human right to privacy.

Luxury surveillance

Luxury surveillance refers to people voluntarily paying a premium to be monitored by smart devices. Amazon Alexa, Fitbits and other health-tracking apps, and Discovery’s car insurance and vehicle-tracking apps fall into this category, as does Amazon’s Ring camera and, more locally, Vumatel’s neighbourhood community surveillance networks. 

Some argue that they are not that dissimilar to the ankle bracelets used on parolees or immigrants awaiting hearings. These surveillance devices are expensive and only really affordable for the wealthy, a kind of status symbol. 

They exist to make life more convenient, and promise benefits for health and security, but the data collected may still be used to target individuals with personalised advertisements. They are also used to effect positive behavioural changes, such as improving fitness, which can be seen as manipulation — albeit in a positive sense — to act a certain way. 

However, studies show that these devices are not particularly effective at changing behaviour. This normalisation of — and even demand for — surveillance for health and security reasons also raises concerns about violating the privacy of individuals, especially as those who do not wish to opt in feel increasing pressure from peers and organisations to share their data with devices and businesses.

Take security cameras at the entrance to homes that enable people to see into their neighbours’ yards, for example. Is there any way to opt out? 

Also, what happens to all the data that is collected? “These gadgets are analogous to the surveillance technologies deployed in Detroit and many other cities across the [US] in that they are best understood as mechanisms of control: they gather data, which are then used to affect behaviour,” says Chris Gilliard for The Atlantic.

The data collected can theoretically be leveraged against people by their employers, the government, their neighbours, stalkers or domestic abusers. Potentially, it enables the study, prediction and control of human beings and populations. 

As always, we need to weigh the individual and collective benefits of technology, in convenience, efficiency, health and security, against the social costs.


Not very long ago, starting your own business required a significant investment of time and money. Recently, however, there has been a proliferation of AI tools to help would-be entrepreneurs get a business off the ground within minutes and with practically no upfront investment. 

Online tools can assist with everything, from logo design to website building, and some go as far as facilitating the actual production of goods. One such platform is CALA, which deploys AI and machine learning to streamline the entire fashion supply chain.

More than 40 brands and independent designers currently use this technology. Copy.ai is a tool to create human-sounding content for a business’s website and social media platforms.

The barriers to starting, let alone succeeding in, a new business are high. Often, founders launch new ventures as side hustles while working for an employer, so their spare time is limited. 

Another struggle is access to funding. These AI tools, however, free up time and money for the business-building work that the technology cannot do itself. There’s also the bonus of empowering more disadvantaged entrepreneurs. 

These wins don’t only apply to new businesses, though. Established enterprises can also adopt these technologies to automate existing processes or launch new products.

As a result, businesses need to keep abreast of the latest AI tools to ascertain which can benefit their companies. In fact, hiring AI specialists who understand the complexity of the field is fast becoming an important requirement. If businesses are unable to afford a dedicated specialist, they are advised to seek the help of external consultants.

Intersoft, for example, offers AI advisory services to help its clients grow their businesses, while EY also offers AI consulting services.

However, businesses should remember, too, that technological competitive advantage is short-lived — that which can be digitised and automated can also be replicated by your competition. 

So, by all means, use the tools that are available, but be wary of relying on AI completely for business advice or future-proofing.

The Future is published by Tafelberg, a division of NB Publishers.