Algorithmic audits: an accountant's view
A guest blog by Anjum Rahman
Algorithms are used in a variety of ways: to rank search engine results, to promote posts and increase virality, and to recommend similar content. It's important that we understand the impact of algorithms and the harms they may cause. This blog explores how an independent 'audit' could provide a mechanism to assess these harms, using the financial auditing process as an example of how an algorithm audit infrastructure might be established.
Algorithms and machine learning aren't new; they've been around for a long while. Algorithms have existed for around 4,000 years, and Ada Lovelace is credited with writing the first computer algorithm. Machine learning is more recent, becoming widespread in the 1990s. These are vast areas of knowledge with many practical applications.
The use of algorithms provides significant benefits, but they also have the potential to cause harm. There are examples of algorithms contributing to ethnic cleansing and genocide, recommending ever more incendiary content that can radicalise viewers, and entrenching discrimination and racism.
Up to now, tech platforms have operated without any significant legislation covering the algorithms and machine learning they develop and use. Domestic legislation in many jurisdictions has struggled to keep up with the technology. Given the level of actual and potential harm, there is a need for regulation and for the protection of communities.
This blog proposes an infrastructure for algorithmic audits that would allow independent assessment. The audit would provide a degree of assurance (or not) about any negative impacts of a platform's algorithm and machine learning systems. An effective infrastructure can be based on the model used for financial audits. There are parallels in the handling of commercially sensitive corporate information, and in the need for professional expertise and judgement to offer an opinion on the reliability of what is in the public domain.
Online platforms and radicalisation
Terrorism and violent extremism are terms which are fraught and contested. There is no internationally agreed definition of terrorism, and the instances of states using these terms in attempts to delegitimise and to criminalise resistance are too numerous to count. Moreover, nation states have, on occasion, terrorised both their own and foreign populations.
There are a variety of methods used to share and promote extremist content, with the aim of changing culture and attracting new recruits to the cause. Social media platforms have, to some extent, been active in dealing with the issue since 2016. They self-regulated through the development of a hash-sharing database, which currently sits with the Global Internet Forum to Counter Terrorism.
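The hash-sharing idea can be illustrated in a few lines of code. This is a simplified sketch only: the real database uses perceptual hashes (which also match slightly altered copies) rather than exact cryptographic hashes, and all names below are illustrative, not drawn from any platform's actual system.

```python
import hashlib

# A shared set of fingerprints of previously identified terrorist and
# violent extremist content. In practice this is an industry-maintained
# database; here it is just an in-memory set (illustrative only).
shared_hash_db = set()

def content_hash(content: bytes) -> str:
    """Fingerprint a piece of content. Real systems use perceptual hashes
    so that slightly altered copies still match; SHA-256 only catches
    exact duplicates."""
    return hashlib.sha256(content).hexdigest()

def register_known_tvec(content: bytes) -> None:
    """A platform that identifies harmful content adds its hash to the set."""
    shared_hash_db.add(content_hash(content))

def is_known_tvec(content: bytes) -> bool:
    """Any participating platform can check new uploads against the set
    without the platforms ever exchanging the content itself."""
    return content_hash(content) in shared_hash_db

register_known_tvec(b"example of previously identified content")
print(is_known_tvec(b"example of previously identified content"))  # True
print(is_known_tvec(b"an unrelated upload"))                       # False
```

The point of the design is that only fingerprints are shared, never the content, which is one reason platforms were willing to adopt it voluntarily.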
Other tech platforms are not immune to exploitation by terrorists and violent extremists. Video gaming platforms have been a means of recruitment, and there are examples of Roblox being used as a platform for games that recreate the Christchurch mosques attack. Any platform where users have the ability to post content, and which uses algorithms to search, promote or recommend that content, is at risk.
The financial audit model
Modern financial audits developed with the rise of jointly-owned companies, where ownership was held through stocks and shares. These owners did not have direct contact with the governance, management and operations of the business they had invested in, and they wanted assurance that the reports provided to them were accurate and fairly reflected the operations of the company.
Professional auditors were employed to check the financial records, as well as the measurement methods and internal control systems of companies. For a financial audit to be reliable, a number of things had to be in place: an infrastructure of trained people, with related regulations and shared agreement on principles and practice.
Audit principles
Audits should be underpinned by a set of principles or values. These provide guidance as to the purpose and process of audits. Standard audit principles focus on integrity, confidentiality, independence, obtaining appropriate evidence, and managing audit risk. These principles guide the planning of audits and the conduct of auditors. Some of these principles are explored in more detail below.
Principles underwrite the basic purpose of audits: that users are able to rely on financial information provided by those who report it, whether publicly or to a select group. There is general agreement in society that financial information should be reliable, and that significant harm happens when it is fraudulent or there are major errors. Audit principles have developed over years of practice, and through a process of consensus.
As organisations operate across borders, audit standards have needed to be consistent across jurisdictions. States have recognised the value of this consistency, and many subscribe to international standards as the basis of their own regulations.
When it comes to online spaces, there is still much work to do to build consensus around principles, or indeed around the general purpose of an algorithmic audit. While a free, open and secure Internet is understood by many as a basic principle, many nation states maintain significant controls that are inconsistent with it, to the extent that some states have removed access to online spaces for periods of time to quell protest, dissent or violence.
It is critical that such consensus is built across borders, with the involvement of communities that use and are impacted by online spaces. The Christchurch Principles, developed in the wake of the terrorist attacks on the Christchurch mosques, are an example of a starting point.
They are important, as they begin to provide a basis against which the performance of algorithms can be measured and an opinion issued. They are based on improving democratic outcomes and reducing harm. While these principles can be interpreted in different ways, professional judgement is part of the work of any financial auditor, and this would also be the case for auditors of algorithms.
Another example of a set of principles is The Signal Code, which uses a human rights-based approach.
Commercial sensitivity and confidentiality
One of the main points of resistance to algorithmic audits from online platforms has been commercial sensitivity. They don't want external parties to have access to their code, which is understandable. In fact, given that code is now partly produced by machine learning, some platforms have told us they don't know exactly what is in it.
However, it is not the code itself that is of relevance to the wider public. Rather, it is the way the algorithms work and the impact that they have on the lives of people that is of concern. In order to assess this, researchers would need to have access to data and internal records, reports and correspondence. Of course, all of this material is shrouded in commercial sensitivity.
This is no different from the financial records, minutes of meetings and other documents accessed by financial auditors. Financial auditors have access to any computer programmes used to prepare the financial statements; they consider why particular measurement methods have been used and what impact decisions by governance and management have had on the financial statements.
They gain access to these internal records under a strong legal requirement of confidentiality. No member of an audit team can publicly disclose anything they have seen or had access to during the course of the audit. If they do find anything illegal, they are required to disclose it to the relevant state agencies dealing with financial fraud.
What they disclose publicly is an opinion. Generally, an audit opinion will be less than two pages, and will provide reassurance that the financial reports fairly reflect the financial activities of an organisation. If there are significant mistakes, errors or deliberate misstatements that the organisation refuses to correct, these will be listed, along with any breaches of financial standards. Given that no organisation wishes to have any such things on their audit opinion, they tend to make the changes requested by the auditors or disclose in the financial statements any particular issues.
Sitting behind the audit opinion are weeks (if not months) of investigation and detailed access to information. The general public sees only the public financial statements and the audit report; confidentiality is guaranteed. A similar level of confidentiality can be put in place for algorithmic audits, so long as the rest of the infrastructure is in place. This would remove any reasons for resistance based on commercially sensitive information.
An additional layer of complexity for algorithmic audits is private and encrypted information. Individuals can and should expect privacy for content that isn't publicly available. However, financial information also includes private information, for example payroll data or communications from third parties. As long as that information is kept confidential, and disclosed by auditors only where it reveals illegal activity, the confidentiality requirement will keep the balance between privacy and the reduction of harm.
Independence
The public can only rely on an audit opinion when they are sure the person providing it is completely independent of the organisation being audited. Auditors take independence incredibly seriously, as it is the basis of trust in the audit. It is not enough for an auditor to actually be independent; they need to be seen to be independent as well.
For this reason, many audit firms will not allow partners or staff to own shares in an organisation they are auditing, no matter how small the amount. All staff are required to disclose annually all of their shareholdings.
But influence doesn't only happen through ownership; there are other ways a person can control an organisation. This is why the rules are broader: they take into account board memberships, associated persons (family members or family trusts that might have ownership) and other matters.
Auditors, by the nature of their work, will have a good understanding of the financial processes and internal controls of an organisation, as well as the weaknesses in them. They are often best placed to provide advice and services to improve these systems. However, when an audit firm provides a significant amount of other services, it impacts the independence of the audit. The firm begins to rely on that other income, and it becomes more difficult to jeopardise it by providing an adverse audit opinion. This is why, in some jurisdictions, audit firms are prevented from providing any other services to the organisation they are auditing. They can recommend needed improvements as part of the audit process, but cannot be involved in implementing them.
For algorithmic audits to be effective, regulation of the independence of auditors is essential. That regulation may be set by an industry body of auditors or it could be through regulation from government. The latter is a much more cumbersome process, and regulation by professional bodies has sometimes been effective.
Sampling and internal controls
A financial audit does not involve reviewing every single financial transaction. This would not be possible, given the volume of transactions for any medium to large organisation; it would be difficult even for smaller ones.
A sampling approach is therefore used to test the reliability of systems. Internal control systems are the processes and procedures put in place to ensure that errors or deliberate falsehoods are caught and corrected. They are the checks and balances built into the system to ensure accuracy.
In a financial system, internal controls include things like division of labour, so that more than one person is involved in ordering goods, authorising the resulting invoice and authorising payment. They might include accounting software that flags unusual entries, for example a payment entered into an income account. They will certainly include reviewing the minutes of governance board meetings for approvals and major events over the past year.
Picking a sample of transactions is a matter of judgement based on the size and type of organisation being audited. For smaller organisations, auditors may pick a random month and check all transactions that happened in that month. In addition, they will check all the major transactions, and these are defined by the size of the organisation or impact of the transaction.
If checking of the sample brings up a number of anomalies, then the sample size will need to be broadened. The scope of the audit becomes wider, which takes more time and effort. Again, it is a matter of professional judgement as to whether the anomalies are numerous or significant enough to merit the additional work.
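The sample-and-widen approach described above can be sketched in code. Assume a year of transactions, a single control check (a payment posted to an income account, as in the earlier example), and a rule that widens the sample when anomalies appear. The threshold, field names and widening rule are all illustrative; they are not drawn from any real audit standard, where this judgement is exercised by a professional.

```python
import random

def check_transaction(txn: dict) -> bool:
    """Illustrative control check: flag a payment posted to an income
    account, the kind of unusual entry accounting software should catch."""
    return not (txn["type"] == "payment" and txn["account"] == "income")

def audit_sample(transactions: list[dict], seed: int = 0) -> dict:
    """Check all transactions in one random month; widen to the full
    year if the anomaly rate exceeds a (made-up) threshold."""
    rng = random.Random(seed)
    month = rng.randint(1, 12)
    sample = [t for t in transactions if t["month"] == month]
    anomalies = [t for t in sample if not check_transaction(t)]
    widened = False
    # Professional judgement, crudely encoded: >2% anomalies widens scope.
    if sample and len(anomalies) / len(sample) > 0.02:
        widened = True
        sample = transactions
        anomalies = [t for t in sample if not check_transaction(t)]
    return {"month": month, "widened": widened,
            "checked": len(sample), "anomalies": len(anomalies)}
```

With a clean month the auditor stops there; a month full of anomalies triggers a full-year review, mirroring the widening of scope described above.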
The work continues until the auditor feels assured that enough evidence has been obtained to give assurance that the financial reports provide a fair reflection of the financial activity of the organisation. Ultimately, it is the auditor’s reputation and livelihood on the line if they provide assurance that isn’t backed by adequate evidence.
An algorithmic audit would also test the internal controls of an online platform. In the area of terrorist and violent extremist content (TVEC), this could include the accuracy of algorithms that remove content. Are the algorithms removing all the content that should be removed, or are they missing significant harmful content? Are they removing content that shouldn't have been removed, for example due to political pressure, poor system design, poor definitions, or institutional bias?
Similar to financial audits, the volume of material for any individual platform will be too high to test in full. Methods to determine sample sizes that would give sufficient evidence would need to be developed. Those methods need to be flexible enough to fit different circumstances, and yet have some basic uniformity of understanding so that assurance would be possible across all platforms. Not a simple task, but it can be done with expertise, professional judgement and sufficient resourcing.
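One way to make the accuracy questions above measurable is to compare a platform's removal decisions against an independently human-labelled sample, and report precision (how much of what was removed should have been removed) and recall (how much of what should have been removed actually was). The sketch below assumes such a labelled sample exists; the numbers and names are illustrative only.

```python
def removal_accuracy(decisions):
    """decisions: list of (removed, should_remove) boolean pairs, where
    should_remove comes from independent human labelling of the sample."""
    tp = sum(1 for removed, should in decisions if removed and should)
    fp = sum(1 for removed, should in decisions if removed and not should)
    fn = sum(1 for removed, should in decisions if not removed and should)
    precision = tp / (tp + fp) if tp + fp else 1.0  # over-removal check
    recall = tp / (tp + fn) if tp + fn else 1.0     # missed-content check
    return precision, recall

# 8 correct removals, 2 wrongful removals, 2 missed items, 8 correct keeps
sample = [(True, True)] * 8 + [(True, False)] * 2 + \
         [(False, True)] * 2 + [(False, False)] * 8
p, r = removal_accuracy(sample)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.80 recall=0.80
```

Low recall corresponds to the first concern (missed harmful content) and low precision to the second (wrongful removal), so an auditor's opinion could speak to both.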
Standards
Given the vast difference in the types of organisations that conduct financial activities, there is a need for a set of standards to guide the preparers, auditors and users of financial reports. Accounting standards give guidance to chartered accountants who prepare financial reports, and there are usually separate standards for not-for-profit organisations as compared to commercial organisations. Auditing standards give guidance to auditors. Professional and ethical standards give guidance on values and ethics.
Each country sets its own standards through its laws and regulations. However, given the international nature of financial activity, it is useful for jurisdictions to be consistent. This need led to the development of the International Financial Reporting Standards.
In Aotearoa New Zealand, standards were originally set by the professional body for chartered accountants. However, in 1993 that function was moved to an independent statutory body called the External Reporting Board. The Board was set up by the Financial Reporting Act, which provides the framework for its roles and responsibilities.
Standards are set through a process that requires both research, and consultation with affected parties. This structure adds another layer of independence to financial audits, along with better transparency.
Our jurisdiction uses the International Financial Reporting Standards as the basis for local standards. Publicly listed companies are required to comply with these, including foreign companies listed on the New Zealand Stock Exchange.
Similar structures are required for algorithmic audits. A complex process and structure is needed for the complexity of issues and harms that could occur. For the financial system, harms are economic. With algorithmic audits, harms relate to personal safety and in some cases, to life itself.
There are many and significant technological challenges when it comes to algorithms, but this is equally true for financial systems. Many large organisations have bespoke financial software, which auditors must have proficiency in. They need to understand the risk points for that particular software, requiring a depth of expertise which may not be transferable to another organisation.
Funding and audit risk
A critical consideration is who will pay for the infrastructure required for an audit process. For financial audits there are various mechanisms that fund the process. Chartered accountants and auditors pay fees to their professional bodies, and in some jurisdictions part of these fees may be used to fund standard setting.
Statutory bodies are most often funded through general taxes, which allow for more independence in standard setting. It does put them at the mercy of the government budget process, and budget cuts based on changing government priorities. However, this is a more secure form of funding than many others.
The audits themselves are paid for by the organisation being audited. They are invoiced by the auditor based on the work required to be done. This is standard practice across many countries.
There are inherent risks in auditing, especially that the auditor might miss some significant error or piece of information that renders the financial reports unreliable. Auditors will do their best to ensure that their processes are strong enough to minimise this risk. Even so, they must carry professional liability insurance, which is compulsory for any financial auditor registered in Aotearoa New Zealand.
There is a long history of civil action against auditors, particularly by those owed money by failed organisations. Auditors have been sued for negligence in detecting errors or fraud, or in ensuring disclosure of the lack of financial viability of an organisation. Legal action is taken against auditors to recover funds, because company directors and senior management are often bankrupted along with the organisation and there is no money available from that source.
For financial audits, the organisation being audited chooses their own auditor. This can undermine independence, and checks are required to ensure companies are unable to change auditors simply because the auditor was asking too many difficult questions or was requiring disclosures or changes that the organisation is unwilling to make.
It is neither healthy nor effective for organisations to choose an auditor that will give them an “easy” audit with minimal investigation. There are some ways to prevent this, such as requiring any potential new auditor to check with the previous auditor if there is any professional reason why they shouldn’t take on the job. Professional reasons would include these kinds of disputes. This is an area which can and should be strengthened.
Competition for audit services does help to keep costs reasonable, so organisations need the flexibility of being able to change auditors. A change in auditors is also recommended after a period of time, to ensure a fresh set of eyes and to mitigate the risk of strong personal relationships having an impact on independence.
Audit costs can be a burden for smaller organisations. This is why standards may provide different levels of assurance for smaller organisations. The not-for-profit standards in Aotearoa New Zealand allow smaller organisations to opt for a review, rather than a full audit. A review provides a lower level of assurance, but there is also lower risk of financial loss for a smaller organisation.
This would not be the case for algorithmic audits. A small organisation can cause significant harm to a small group of people, and if that harm is violence (including emotional violence) or loss of life, then the loss is not negligible or minor. In this case, an additional funding source will be required, to ensure that audits of all organisations are robust.
Conclusion
Effective auditing requires investment in a robust infrastructure: one that includes standards, professional ethics, underlying principles, a professional body and registered auditors. It is a complex undertaking, built on consultation, consensus and the public good.
Financial auditing provides an example of such an infrastructure, and a model for the development of algorithmic audits. Such an infrastructure can help provide assurance, while maintaining the required commercial sensitivity and privacy rights.
Algorithms have the potential to cause significant harm, and regulation is required. The audit process outlined above provides a framework for regulation, while allowing adaptability and flexibility. This is not a nice-to-have, but a necessity for the safety and protection of populations across the world.
Anjum Rahman is a co-chair of the Christchurch Call Advisory Network and a co-lead of the Inclusive Aotearoa Collective Tāhono. She has worked most of her life as a chartered accountant and brings this lens to a view of algorithmic audits.