It may sound like science fiction, but for many people, it is a reality. Ed Santow, former human rights commissioner of Australia, warned that Australia was “sleeping its way into mass surveillance” by failing to take digital privacy seriously.
Privacy in the digital environment
Companies have been collecting information about their customers for decades. If you have ever joined a loyalty program such as FlyBuys, you have taken part in what marketing agencies call a “value exchange”: you hand over your personal information in return for discounts or special offers.
Consumer data is big business. A 2019 report by digital marketing firm WebFX found data from 1,400 loyalty programs was routinely traded around the world as part of an industry valued at around US$200 billion. In the same year, a review of loyalty schemes by the Australian Competition and Consumer Commission found many programs lacked transparency and discriminated against customers.
The digital world makes this kind of collection far easier. Netflix, for instance, can tell what you watch and when you watch it. It goes even further, capturing data on which episodes or scenes you watch again and again, and the ratings you give its content.
Hyper-collection: a new threat to privacy
The Australian Information Commissioner has ordered the controversial tech firm Clearview AI to stop “scraping”, or collecting, images from social media for its massive facial recognition database. And just this month, the commissioner opened an investigation into several retailers that were creating faceprints of their customers.
This new phenomenon, “hyper-collection”, reflects a growing tendency among large companies to gather, sort, analyse and use more data than they need. It is usually done covertly or passively, and often has no legitimate legal or commercial purpose.
Hyper-collection and digital privacy laws
In Australia, hyper-collection poses a serious problem for three main reasons.
First, Australia’s privacy laws were never designed to deal with the likes of Netflix and TikTok. Despite numerous amendments, the Privacy Act dates from the late 1980s. And the recent change in government has delayed the review of the act that former attorney-general Christian Porter announced in late 2019.
Second, Australian privacy legislation is unlikely to dent the profits of foreign companies – especially those based in China. The Information Commissioner can order companies to do certain things, as it did with Uber in 2021, and court orders are available to enforce such determinations. But the penalties are not enough to deter companies making billions in profit.
Read more: 83% of Australians want tougher privacy laws. Now’s your chance to tell the government what you want.
Third, hyper-collection is often enabled by the vague consents we give in exchange for access to these companies’ services. Bunnings, for example, argued its collection of customers’ faceprints was permitted because signs at the entry to its stores told customers facial recognition might be in use. Meanwhile, online marketplaces like eBay, Amazon, Kogan and Catch rely on “bundled consents”: you must agree to their privacy policies as a condition of using their services. No consent, no access.
TikTok and hyper-collection
TikTok, owned by Chinese company ByteDance, has overtaken YouTube as the go-to platform for online video creation and sharing. The algorithm-driven app has already drawn criticism for its routine collection of user data, and for ByteDance’s secretive approach to content moderation and censorship.
For years, TikTok executives have assured governments that user data does not reside on servers in mainland China. Recent allegations suggest those assurances may be hollow.
A security guard tries to stop a photographer outside the Beijing headquarters of TikTok owner ByteDance in August 2020. Wu Hong/EPA
Cybersecurity experts claim the TikTok app regularly connects to servers in China, and that ByteDance employees – including a mysterious Beijing-based “Master Admin” – have access to every user’s information.
Just this week, it was reported that TikTok can access nearly all the data on the phone it is installed on – including photos, calendars and emails.
Under China’s national security laws, the government can order tech companies to hand that information to police or intelligence agencies.
What are our options?
Unlike with a physical shop, we have little say over digital companies’ privacy policies and how they collect our information.
Vanessa Teague, an encryption expert at the Australian National University, suggests consumers delete the offending apps until their makers agree to greater transparency. This will, of course, lock us out of those services – and it won’t make much impression on a company unless enough Australians join in.
Read more: Facial recognition is on the rise – but the law is lagging a long way behind.
Another option is “opting out” of intrusive data collection. We’ve done this before: when the My Health Record scheme moved to an opt-out model in 2019, a record number of us opted out. Though this reduced the usefulness of the digital health record program, it demonstrated that Australians can take their data privacy seriously.
But how exactly can Australians opt out of a social app as huge as TikTok? Right now, they can’t – a problem the government should explore in its review of the Privacy Act.
The Privacy Act review is also considering whether individuals should be able to sue companies for privacy violations. Lawsuits are expensive and slow, but the prospect of them could push big companies to change their behaviour.
Whatever option we choose, Australians must become more aware of our privacy. That might mean actually reading the terms and conditions before accepting them, or being prepared to “vote” with our feet when companies refuse to be transparent about how they use our data.