The world’s first eSafety commissioner, Julie Inman Grant, sees no reason why the safety standards expected of machinery like cars shouldn’t exist online. Directors of companies with online activities would do well to take notice.
Before Julie Inman Grant became Australia’s inaugural eSafety commissioner in January 2017, she had spent more than two decades working in the technology industry. Today, she uses her insider knowledge about the way big tech companies operate to help make the internet a safer place — regarding herself as a “poacher turned gamekeeper”.
Inman Grant began her career in Washington as a legislative assistant in Congress. She worked in the non-profit sector before joining Microsoft, where she remained for 17 years.
“Sitting in security and privacy product reviews whilst at Microsoft was what inspired me to apply this to safety and personal harms, providing companies with a roadmap for assessing risk and building in safety,” says Inman Grant.
She also held senior positions at Twitter and Adobe, but became increasingly disillusioned by the failure of large tech companies to treat user safety with the same seriousness that they treat the industry touchstone of IT security.
“When I joined Twitter in 2014, for example, I genuinely believed in the platform’s promise to be a vehicle for freedom of expression and allowing people to speak truth to power,” she says. “But gradually I began to see how platforms such as Twitter were being misused, leading to serious online abuse of individuals, particularly women and minorities.”
She observed that the bullying and abuse effectively silenced their voices, which led her to conclude that the technology was being used as a tool of suppression. It was around the same time that she was approached to take on the world-first regulatory role.
The office of the eSafety Commissioner was established under the Australian Communications and Media Authority (ACMA) in 2015, with the purpose of better protecting Australian children from online harms and redressing the massive power imbalance between tech companies and ordinary citizens. In January 2022, Inman Grant was reappointed by the federal government for a second five-year term. This occurred at the same time as the Online Safety Act 2021 came into effect, broadening her powers to include being able to order social media platforms and other websites popular among children to remove cyberbullying content within 24 hours (the previous time frame was 48 hours). Failure to do so can result in a fine of up to $111,000 for individuals, and up to $555,000 for companies.
It also enabled her to order the removal of cyberbullying content aimed at adults, whereas she previously only had powers relating to content targeting children. The world-first adult cyber abuse scheme enables the eSafety commissioner to order the takedown of intimate images that were shared without the subject’s consent, abhorrent violent material and other forms of restricted online content.
“We also have a specific legislated power to order blocking or removal of certain terrorist and violent extremist content, including that related to events such as the 2019 live-streamed terror attacks in Christchurch or the Buffalo terror attack shooting,” she says.
This is complemented by the educational programs eSafety carries out to help educators empower children and adults to keep themselves safe. The office is also working with industry to enact mandatory codes and — through its Safety by Design initiative — to embed greater protections within the basic architecture of products.
Creating new codes
The new industry codes of conduct will apply to eight sections of industry — social media services, websites, search engines, app stores, internet service providers, device manufacturers, hosting services and electronic services, including email, messaging, gaming and dating services. Industries will be required to take reasonable and proactive steps to detect and remove illegal content such as child sexual exploitation material, and also take more responsibility in shielding children from harmful content.
“Seven years ago, there was no playbook for online safety regulation, so we’ve had to fill in the pages as we’ve gone along,” says Inman Grant. “But we now believe we have developed an effective and replicable model, which other countries are using as they take their first steps towards regulation.”
She is conscious of the need to make balanced and nuanced decisions about online safety regulation, especially as it has never before been addressed. “Some of the decisions I make to keep Australians safer online may require trade-offs and I accept I’m not going to make everyone happy, especially my former colleagues in industry. But I’m not here to make friends, I’m here to make tough decisions that protect the safety and wellbeing of Australians.”
However, she is pleased when relations remain cordial. Last month, she bumped into a former colleague, Microsoft president Brad Smith, during a conference. “He kindly said it was beneficial to have people like me, who spent their formative years in tech, shaping tech policy and regulation,” says Inman Grant. “I was also thrilled that he called our Safety by Design initiative one of eSafety’s crowning achievements.”
Inman Grant’s approach to the task of getting the industry to develop robust codes is deliberately non-combative, and she seeks to highlight best practice and industry innovation whenever possible. However, she says she is not shy about using the legislative powers available to her when incentives fail to spur meaningful action.
“One valuable lesson I learned from my time in the industry is that it’s better to do this with them rather than to them — the so-called co-regulatory model,” she says.
The complexity of the task requires a staged approach, with the first set of codes focusing on the most harmful content, including material relating to child sexual exploitation. If an industry’s draft codes do not contain appropriate community standards, the eSafety Commissioner will have the power to declare industry standards.
Cooperating to decrease harm
Due to her familiarity with the operations of tech companies, Inman Grant is under no illusion as to prevailing attitudes towards regulation within the industry. “Whether fledgling startups or billion-dollar corporations, the default position of the tech industry is generally that they want to be left alone by government, all governments,” she says. “There continues to be a firm belief within industry that a completely unregulated environment is the best way to innovate and drive profits. Yet we also know that many people join the tech industry because they believe in the power of technology to do good. They want to create products and services they can be proud of. They don’t want to contribute to a toxic cesspool.”
When developing its Safety by Design initiative, her office reached out to a broad range of tech industry members for input. This included the behemoths of Google, YouTube, Meta (Facebook) and TikTok, as well as Roblox, Lego and Nextdoor. The assessment tools were published in 2021 and the aim is for them to be used by tech companies to ensure they are building safety into their products and services. Inman Grant sees it as a potential game changer.
“Safety by Design as a concept encourages technology companies to anticipate, detect and eliminate online risks as the most effective way to make our digital environments safer, more inclusive and less toxic — especially for people who are most vulnerable,” she says.
The initiative provides companies with an online safety framework as well as practical tools for detecting risk and monitoring outcomes, including a safety impact assessment and a potential roadmap for building in safety through each stage of development, design and delivery.
Inman Grant is convinced that the radically different approach of building safety into a product, instead of tacking it on as an afterthought, makes good business sense. It prompts companies to invest in a better customer experience and saves them from having to clean up reputational “tech wrecks” later. She believes boards in Australia need to do more to ensure online safety for their workers. “There is a certain lack of awareness among Australia’s broader corporate leadership of the impact of online harm and their own responsibility in relation to it. There’s a view that online safety is really only relevant to the big consumer-facing technology companies.”
With the possibilities for misusing technology to cause online harms virtually limitless, any organisation with an online platform or service needs to step up. “If there’s one message I’d like to give, it’s this — the principles of online safety apply across the spectrum of business and community life. Whatever platform or online service you offer may be open to abuse, and the consequences for victims can be devastating, whether they be your workers, customers or anyone else.”
In March, four Australian regulators announced they had joined forces to regulate digital platforms more effectively.
The Digital Platform Regulators Forum includes the eSafety Commissioner, the Australian Information Commissioner (OAIC), the Australian Communications and Media Authority (ACMA) and the Australian Competition and Consumer Commission (ACCC).
“As a result of a couple of ACCC inquiries into digital platforms, there’s been evidence submitted by businesses that Google and Facebook and other large digital platforms are using their market dominance to extract unfair terms,” says Clare Giugni, a lawyer at Holding Redlich.
Currently, it is difficult for businesses to negotiate the terms of the products that the likes of Google provide — such as Google Analytics — simply because they’re in a far weaker bargaining position.
Smaller rivals are often pushed out of the market. Remedying these harms has historically been very challenging.
“The fact that four regulators have got together and created their own forum is an indication of the difficulty in regulating online platforms,” says Lyn Nicholson, general counsel corporate and commercial at Holding Redlich.
Some online platforms have been accused of discriminatory pricing, where customers are offered different prices based on what they are known to be willing to pay. It is another harm that is difficult to detect.
“Online platforms are very good at exploiting gaps in the legislation to enhance their business model,” says Nicholson. “Technology platforms continually say that they’re just a tool and that they’re neutral. However, their algorithms direct people in a certain way, so that is not neutral.”
For the purposes of the forum, a digital platform is defined broadly. It covers both big and small players that provide internet search engines, digital content aggregators, social media services, private messaging services, media, referral services and electronic marketplaces.
“As businesses seek to digitise and look to incorporate broader services in their offering, they also need to ensure they’re not misbehaving, but behaving in the same way as the big players,” says Nicholson. “If your business falls within the definition of being a digital platform, you need to make sure you’re complying with the law. If you spam someone, for example, ACMA is going to share information with the Privacy Commissioner and the ACCC, who might then look at your other practices and investigate whether your terms are misleading and deceptive.”
Nicholson believes that the sharing of information could lead to early regulatory action and joint statements, which will give the public confidence in the overall regulation. She is doubtful that anything beyond that will occur.
“The main challenge will be resourcing and funding,” she says.
“Each of those bodies already has their day job, and new tasks such as the sharing of information are an additional burden. That is why the best outcome we’re going to get is joint statements. To do anything further is really hard. But equally, if they found a harm, they could coordinate the regulatory action, and if they all took different elements of it, that would be useful.”