Carrington Legal

Legal Services

Values-driven consultant general counsel delivering strategic, commercial and, most importantly, straightforward advice.

About us

Values-driven general counsel and company secretary with experience advising executives and boards, establishing and growing teams, reducing spend, fixing problems and enabling delivery. With past roles across the sport, technology, health, charity and government sectors, my aim is to support organisations to deliver through strategic, commercial and, most importantly, straightforward advice.

Advice - With over 20 years' experience in London and Melbourne, in-house and in global law firms, and with the last 12 years as General Counsel, I can provide pragmatic, solution-focussed advice across a breadth of practice areas. My particular focus is on privacy, cyber, IT, intellectual property, sports law, child safety, commercial contracts, franchising, corporate governance and company secretarial work.

In-house Support - For organisations that need more comprehensive in-house support, whether on an interim, project-specific or ongoing basis, and which may not be able to sustain a full-time employed General Counsel, I offer part-time services with a range of cost models, including reduced rates for not-for-profit clients and pro bono opportunities.

Sport Investigations & Member Protection - I have particular experience with sporting clubs and regulators, providing outsourced member protection services, managing sensitive investigations and reviewing child safety compliance.

Consulting - With a history of establishing first-time legal functions, I can also review enterprise operations, identify where legal risk lies and advise on structure, compliance gaps and process improvement.

Website
www.davidcarrington.com.au
Industry
Legal Services
Company size
2-10 employees
Type
Privately Held
Founded
2021
Specialties
Privacy Law, Sport Law, Cyber, Legal Services, Consulting, Governance, Integrity, Contracting, and Intellectual Property

Updates

  • New guidance this week from the Office of the Australian Information Commissioner on the intersection of AI and privacy for businesses, and on privacy for charities and NFPs, is very welcome and will help organisations determine a path forward on privacy compliance. But are executives and boards that ignore the guidance merely negligent, or are they heading towards recklessness? The more guidance regulators issue in this space, particularly around handling sensitive data, the more difficult it will be to avoid that conclusion. With recklessness a requirement under the new statutory tort for serious invasions of privacy, which allows individuals to sue organisations that misuse their data, I suggest the guidance is worth a read...

  • Her - What did you do today, Dad? Me - Well, the Privacy Commissioner released new guidance about the privacy issues raised by AI, so I've been reading that. Her - What's for dinner? Still struggling to get my 11-year-old to care about AI. Perhaps I'll let her watch Terminator. Organisations that are adopting AI - which is every organisation, whether they realise it or not - do need to care. The latest guidance from Commissioner Carly Kind offers fantastic insights and is a vital resource to help organisations avoid breaching the Privacy Act and consumer trust. I'll be sharing more during the week, but it's particularly relevant to those organisations with long-held datasets that they may think can easily be repurposed to train AI models. Unsurprisingly, the consent I gave 10 years ago likely can't be relied on. And why do you even still have my data? https://lnkd.in/gWQanb6m

    New AI guidance makes privacy compliance easier for business

    oaic.gov.au

  • It can be hard to get excited about compliance, but it is necessary - though I've had many clients who don't share that view and still need convincing. Privacy compliance is no exception. Many organisations are happy to do the bare minimum, or less, and hope for the best. I could post a thousand "time to get your house in order" calls to action and they won't resonate. If the new Privacy Act enforcement regime isn't enough to focus the mind, perhaps consider how compliance can be a competitive advantage. The answer lies in trust - hard earned and easily lost. Case in point: the news that Seek is scraping customer data to train AI will inevitably impact customer trust. Competitors who can demonstrate that they meet customer expectations on trust will increasingly reap rewards. (It is no coincidence how many IT vendors now have dedicated "trust centres" on their websites.) Organisations that continue to breach that trust won't survive. Maybe compliance isn't all that bad.

  • As the changes to the Privacy Act meander through the Senate Legal and Constitutional Affairs Committee, Big Tech has stayed largely quiet. Only Google seemed willing to stick its neck out and lodge a submission. Guess whether they argued for (a) a narrower definition of personal information, (b) dilution of the statutory tort for serious invasions of privacy, (c) for the tort not to apply to search engines at all, or (d) all of the above?

  • Insights out of the US last week from Allianz's annual cyber risk outlook, which considered the growth in privacy litigation: "The evolving regulatory and legal environment has brought an uptick in so-called 'non-attack' data privacy-related class action litigation, resulting from incidents such as wrongful collection and processing of personal data - the share of these claims has tripled in value in two years alone." In short, organisations aren't meeting their data and privacy requirements, and regulation and claimant lawyers are finally catching up. With the statutory tort for serious invasions of privacy due to arrive in months, we can expect a similar uptick in Australia. https://lnkd.in/grcZNXmE

    Cyber security resilience 2024 | Allianz Commercial

    commercial.allianz.com

  • Liverpool's Steven Gerrard was known for the occasional "reducer" - a brutal tackle early in the game to let the opposition know he was serious and meant business. In a few months, the Information Commissioner will have a range of shiny new powers under the Privacy Act, including infringement notices. National and state clubs and sporting bodies collect vast amounts of personal information, including that of children. For those who haven't reviewed their privacy compliance for a while, don't be surprised if the Information Commissioner comes in with a reducer...

  • The new right for people to sue companies that seriously invade their privacy will apply to everyone - not just companies with revenue over $3m, as is the case for the Privacy Act. That means that in a little over six months' time, it'll be open season for claimant privacy lawyers. Having worked in the "golden age of privacy" in London in the noughties, when a similar right existed, with its privacy super-injunctions and lawfare, rest assured that there will be firms whose entire business model is built around pursuing companies under the new tort. Organisations should be reviewing their practices now.

  • Former FBI Director Robert Mueller once noted, "There are only two types of companies: those that have been hacked and those that will be." In the age of AI, there are now companies whose data is being used to train AI models, and companies whose data will be. With vendors falling over themselves to release new AI functionality whether you want it or not, lax vendor management processes, and shadow IT meaning staff use tools without due diligence, rest assured that your data is heading out the door and into AI models. For every LinkedIn that gets found to have automatically opted people in to having their data scraped, another ten are doing so without any fanfare. That data may be personal information, bringing you into the realm of the Privacy Act and, shortly, a new statutory tort for serious invasions of privacy; or it may be proprietary, meaning trade secrets might be input and at risk of disclosure to competitors. Three ways to fix the problem: - Read the new Australian voluntary AI guardrails and treat them as mandatory - they almost certainly will be soon. - Review your vendors - pick the top 10 vendors who handle the bulk, or the most sensitive, of your data, and ask them what AI they use, where it's hosted, who has access to it and how it's trained. Then assess the risk and document it. Also check their T&Cs to make sure the sales pitch matches the legal position - vendors are increasingly expert at giving vague AI answers on their websites. - Continue to educate your staff about not putting confidential information into any model. The role of Legal is not to terrify an organisation into banning AI, but to help educate it so it knows the risks it is taking. (Credit to DeepAI for the image of a scary AI lawyer - not a self-portrait.)

  • "Ethics frameworks around the globe have indicated that AI should not be used in high-risk use cases. It is difficult to imagine a higher-risk use case than child protection, where an incorrect opinion could result in lasting serious harm to a child, parents, or both." - Sean Morrison, Victorian Information Commissioner. Yesterday's report from the Victorian Information Commissioner into the use of ChatGPT by a state government child protection worker is a challenging read, as well as a cautionary tale for the use of ChatGPT. It found that the department had no controls targeted at addressing the specific privacy risks associated with ChatGPT and GenAI tools more generally. It also makes clear that the use of AI by the department, and the child protection team, is far wider than the single case that has made headlines. Horror stories about poorly controlled AI usage in organisations are commonplace - the risk, though, has been more likely to be considered hypothetical, e.g. a loss of proprietary data once it is input into a model. However, organisations now need to consider this through a privacy lens. With a new statutory tort for serious invasions of privacy imminent, a real risk exists that the subject in a case such as this may have a civil right to seek damages. Amending your privacy policy to facilitate AI consent is one part of the solution, but unless internal controls are tightened too, organisations will end up paying the price.

  • Is data scraping the new data breach? In the rush to chastise LinkedIn for having scraped content for AI models, we appear to have collectively forgotten how much data we have willingly given to social media platforms for the last decade and more, which then commercialise it via ad revenue. It also brings to mind the outrage over those first major data breaches in Australia. Given how common they now are, the level of public interest has markedly dropped. Perhaps that outrage is now being reserved for AI data scrapes, which seem likely to increasingly dominate the conversation. In many ways they are similar, involving as they do the perceived loss of data. We may come to see consumer trust take a hit as a result. If companies were clearer with consumers about how they were going to use the data - which AI models, where, and so on - then that loss of trust might be avoided. Case in point is the news this week of a medical imaging lab that is reportedly permitting its database of scans to be used to train an AI model, with questions being asked over the adequacy of the consent it has to do so. Many people would support their scans (suitably anonymised) being used to help train AI models that help detect problems in other patients. But if that were done without clear, express consent, that support could quickly change. In practical terms, sourcing consent isn't difficult; it just requires transparency and simplicity in your privacy policy (concepts that underpin the government's new voluntary AI guardrails and the latest proposed Privacy Act changes). But organisations need to make a choice about whether they want to prioritise consumer trust, or scrape data without clear, express consent and risk the public's fury.
