TNW Answers is a live Q&A platform where we invite interesting people in tech who are much smarter than us to answer questions from TNW readers and editors for an hour.

Social media, a tool created to safeguard freedom of speech and democracy, has increasingly been used in more harmful ways. From its role in amplifying political disinformation in elections, to inciting online violence, and lowering levels of trust in media, Facebook isn’t just a space to share “what’s on your mind,” and you’d be naive to believe so. 

As technology advances, it’s becoming harder to detect fake news and manipulated content online. To shed some light on the issue, Samuel Woolley, Program Director of propaganda research at the Center for Media Engagement, yesterday hosted a TNW Answers session.


Woolley is the co-founder and former research director of the Computational Propaganda Project at the Oxford Internet Institute. In his session, Woolley gave insight into topics ranging from how he personally copes with researching the effects of fake news every day to how regulators draw the line between propaganda and genuine political opinion.

Here are the key takeaways from the session:

Woolley started his research into online propaganda after picking up an interest in politics during high school. 

“To be honest, most of my work is motivated by the goal of what anthropologist Laura Nader called ‘studying up’ in her famous 1969 article ‘Up the Anthropologist,’” Woolley said. “The basic idea is that we need researchers and thinkers who study people in positions of power. I wanted to study people who thought differently than I did, governments who were manipulating citizens, that kind of thing. I always was fairly stubborn and argumentative, so it worked well for me.”

In his book ‘The Reality Game: How the Next Wave of Technology Will Break the Truth,’ Woolley discusses the key indicators that the next wave of disinformation is moving from social media to new frontiers including virtual and augmented reality platforms, AI-driven virtual assistant systems, and other forms of technology designed in the human image.

When asked what kind of propaganda we should expect in the AR/VR space, Woolley explained: “I think that the disinformation and propaganda we will see in AR/VR will rely, no surprise perhaps, on the multi-sensory nature of these technologies.

“So the content will look quite different. Whereas now, most disinformation is written, with more and more appearing as images and video, the next wave will be interactive and immersive. For instance, I give an example in the new book of how the Chinese Communist Party has tested using VR to test low-level party officials on their knowledge. Basically, the Party gets these people in a VR room and quizzes them on their loyalty to the CCP. This seems, to me, a lot more potent than say, a Skype call, because it is in VR.”

We’re living in a time where spreading misinformation is as easy as a Facebook Share or a Retweet. Woolley explained that propagandists are pragmatists who use accessible and cheap tools to spread misinformation online. “As machine learning bots become cheaper and easier to make, they will be leveraged for propaganda; we are already seeing this to some extent, but there is still time to act.”

Woolley’s research has made him “cautiously optimistic about the future.” He explained that while it’s true that we face technologically enhanced disinformation and propaganda in amounts, and in ways, we’ve never seen before, it’s also true that much of it is rudimentary. 

“What we’ve got to look out for, and what I explore deeply in The Reality Game, is the emerging use of AI, VR, and deepfake video for the purposes of spreading disinformation. We are ahead of this problem now, as I see it. But we’ve got to act.”
