Microsoft CEO Satya Nadella leaves the Elysee Palace after a meeting with French President Emmanuel Macron in Paris on May 23, 2018.
Aurelien Morissard | IP3 | Getty Images
If Microsoft were to complete an acquisition of TikTok, it would gain a company with strong potential for advertising revenue growth.
But with such a purchase, Microsoft would also take on an entirely new slate of problems.
Microsoft announced on Aug. 2 that it was in talks to buy TikTok’s business in the U.S., Australia and New Zealand, with a deadline to complete the deal by Sept. 15. The company is currently owned by Chinese tech company ByteDance and has become a target of the Trump Administration and other governments over privacy and security concerns. Trump also signed an executive order last week that would ban U.S. companies from doing business with TikTok, but it’s unclear how that order could affect a potential acquisition by Microsoft.
In the U.S., TikTok has grown to more than 100 million monthly users, many of whom are teens and young adults. Those users tune in to TikTok to see full-screen videos uploaded to the app by others. These videos often feature lip syncing over songs, flashy video editing and eye-catching, augmented-reality visual effects.
To say that TikTok represents a business radically different from the enterprise software Microsoft specializes in would be an understatement.
For Microsoft, TikTok could become an advertising revenue powerhouse, but this potential is not without its own risk. Like other social apps, TikTok is a target for all kinds of problematic content that must be dealt with. This includes basic problems such as spam and scams, but more complicated content could also become headaches for Microsoft.
This could include content such as disinformation, hoaxes, conspiracy theories, violence, prejudice and pornography, said Yuval Ben-Itzhak, CEO of Socialbakers, a social media marketing firm.
“Microsoft will need to deal with all of that and will be blamed and criticized when they fail to do so,” Ben-Itzhak said.
Microsoft declined to comment, and TikTok did not respond to a request for comment on this story.
These challenges can be overcome, but they require large investments of capital and technical prowess, two things Microsoft is capable of providing. And Microsoft already has some experience when it comes to moderating online communities.
In 2016, Microsoft purchased LinkedIn for $26.2 billion, and although the business and professional-centric service does not have the degree of content issues its peers deal with, it is still a social network. Microsoft has also run Xbox Live, the online gaming service, since its launch in 2002. Online gaming and social media are different beasts, but they do share similarities.
“Combating misinformation will need to be a mission-critical priority. Microsoft will be new to this as it doesn’t have experience managing a high-profile social network at this scale,” said Daniel Elman, an analyst at Nucleus Research. “That said, if any company can acquire or quickly develop the requisite skills and capabilities, it is Microsoft.”
But these are no small challenges, and these types of problems have become major issues for TikTok’s rivals.
Facebook, for example, was accused of not doing enough to prevent fake news and Russian misinformation ahead of the 2016 U.S. election, and four years later, the company still comes consistently under criticism over whether it is doing enough to prevent that type of content from appearing on its services. In July, hundreds of advertisers boycotted Facebook over its failure to contain the spread of hate speech and misinformation.
Twitter, meanwhile, began to lose key users, such as comedian Leslie Jones, after the company let harassment run rampant on its social network. The company has spent the past couple of years building features to reduce the amount of hateful content users have to deal with in their mentions.
These types of issues have already flared up on TikTok. Far-right activists, white nationalists and neo-Nazis have previously been reported on the app, according to Motherboard and the Huffington Post, which found some users who had already been banned by Facebook and Twitter.
TikTok’s potential content problems, however, may be more similar to those of Google-owned YouTube. The two services depend on user-generated videos for content, and they both rely heavily on algorithms that learn a user’s behavior to determine what kind of content to suggest next.
“The issue with algorithm-based content feeds is it generally gravitates to the most salacious content that shows the highest engagement,” said Mike Jones, managing partner of Los Angeles venture capital firm Science. “There is no doubt that as creators further understand how to drive additional views and attention on the site through algorithm manipulation, the content will increase in its salaciousness and will be a consistent battle that any proprietor will have to deal with.”
Another similarity with YouTube is the amount of content available on TikTok that is focused on minors. Although TikTok does not allow users younger than 13 to post on the app, many of its users are between the ages of 13 and 18, and their content can be easily viewed by others.
For YouTube, the challenge of hosting content involving minors became a major issue in February 2019 when Wired discovered a network of pedophiles who were using the video service’s recommendation features to find videos of minors exposed or in their underwear.
With the number of young users on TikTok, it’s not hard to imagine that Microsoft could wind up with a problem similar to Google’s.
YouTube has also become a cesspool for conspiracy theories, such as the idea that the Earth is flat. That too could become a problem on TikTok, and already, there is evidence of this. The conspiracy theory that Wayfair uses its furniture for child trafficking gained a significant amount of traction on TikTok this year.
To handle these problems, Microsoft would have to invest an immense amount of time and money on content moderation.
For Facebook, this problem has been handled through a two-pronged strategy. The company continually invests in artificial intelligence technology that is capable of detecting bad content — such as pornography, content that contains violence or hate speech — and removing it from its services before it is ever viewed by other users.
For more complicated content, Facebook also relies on thousands of human moderators. These moderators often work for Facebook through third-party vendors as contractors, and they are tasked with going through thousands of pieces of content per day in strenuous working conditions at risk of developing PTSD. These working conditions have come under criticism on numerous occasions, creating public-relations headaches for Facebook.
If Microsoft acquired TikTok, it too would likely have to build up similar AI technology and build out a network of human moderators, all the while avoiding negative headlines for poor working conditions.
TikTok offers Microsoft a tremendous amount of potential in the digital marketing sector, but along with all that upside will come numerous new challenges and responsibilities that the company will have to take on.