News / Legal Brief

Raging against the machine

May 9, 2023

Natalie Scott, Head of Sustainability; Anastasia Vatalidis, Director; Janice Geel, Associate; and Thembelihle Tshabalala, Candidate Attorney


The meteoric rise of artificial intelligence (AI) is generating infinite possibilities, particularly when linked to or combined with other technologies such as neurotechnology. The United Nations Educational, Scientific and Cultural Organisation (UNESCO) describes “neurotechnology” as the use of any electronic device to read or modify the activity of neurons (being the specialised cells that transmit nerve impulses throughout the body) in the nervous system.[1]

The electronic devices typically used in neurotechnology are either invasive (such as micro‑electrodes or other neurotechnological material placed directly onto the brain) or non-invasive (such as magnetic resonance imaging, which maps brain activity to identify brain tumours, strokes and developmental problems,[2] and wearables, including smart watches, headphones, earphones and virtual reality headsets, which monitor the user’s heart rate, stress level, physical activity and behavioural patterns).[3] When combined with AI, these devices enable the collection of neurological, physiological and cognitive information, which can be used to interpret the user’s personal information. Inferences made in the interpretation of such information may be subjective or objective, depending on the identity of the interpreter and the purpose for which the technology is being used.

So, what does this mean for the user’s mental privacy[4] and other cognitive and human rights?

UNESCO, in its publication entitled ‘The risks and challenges of neurotechnologies for human rights’, states that neurotechnology, on the one hand, has immense potential to improve learning and cognition, facilitated by thought-to-text creation and brain-controlled virtual and augmented reality, to name a few examples.[5]

On the other hand, UNESCO warns that these advancements present a novel ethical and human rights dilemma, specifically insofar as they relate to the need to introduce specific human rights to prevent impairments of the user’s cognitive liberty and mental privacy.[6] It further notes that any advances in neurotechnological applications must consider the potential consequences for the user’s autonomy, privacy, consent, integrity and dignity, as users may not always be aware that –

  1. their neurological and other information is being processed; and
  2. their neurological and other information is used in conjunction with AI (underpinned by one or more algorithms) to make inferences about the user’s behaviour, emotions, cognitive abilities and productivity, and to predict the user’s decisions.[7] The algorithm(s), used in conjunction with the neurological information, may also introduce biases against users, which may lead to unchecked discrimination against the user purely on the basis of the algorithm(s) and for purposes not disclosed to the user.

Some companies,[8] from as early as 2014, have integrated neurotechnology into the workplace, and it is unclear whether the employees of these companies were, at all times, aware that their employers were monitoring, amongst other information, their neurological information.[9] These companies have made, and continue to make, use of the following neurotechnologies, amongst others –

  • wireless sensors in the hats of employees working on production lines, which are combined with AI algorithms to detect workplace rage, anxiety and sadness (constituting what is referred to as “emotional surveillance technology”) in order to –
    • (i) enhance workflow,
    • (ii) suggest that an employee be placed on a break or on leave by the employer; or
    • (iii) re‑assign the employee to a less critical task;[10]
  • sensors built into the caps of high-speed train drivers to trigger an alarm if a driver falls asleep;[11] and
  • SmartCaps, wired with electroencephalograms, to monitor employees’ brainwaves in order to measure their level of fatigue and relay the neurological information to the employee in real time, potentially preventing injuries.[12]

In addition to the obvious purposes of these neurotechnologies, employers may use the neurological information of employees for more subtle purposes, such as to inform their decisions on promotions, retrenchments and dismissals. Further, the neurological information of candidates applying for an employment position may, even if not consented to, be used by prospective employers to differentiate between and discriminate against applicants, which could lead to unfair discrimination or unfair labour practices.

Although the introduction of neurotechnology within the workplace has the potential to transform workplaces in order to make them more efficient, the application of neurotechnology in the workplace may prove to be problematic given the current South African labour legislative framework.

Section 7 of the Employment Equity Act 55 of 1998 (EEA) prohibits medical testing of employees by employers unless legislation permits or requires the medical test to be conducted, or unless the medical test is justifiable in light of medical facts, the employee’s employment conditions, social policy, the fair distribution of employee benefits or the inherent requirements of the job. For the purposes of the EEA, medical testing is defined as “any test, question, inquiry or other means designed to ascertain, or which has the effect of enabling the employer to ascertain, whether an employee has any medical condition”.[13] A medical condition is defined as “the state of a patient’s physical or mental health, including the patient’s illness, injury or disease”.[14]

In addition, section 8 of the EEA prohibits psychological testing and other similar assessments unless the test, inter alia, has been scientifically shown to be valid and reliable, can be applied fairly to all employees and has been certified by the Health Professions Council of South Africa (HPCSA) or any other body which may be authorised by law to certify such tests or assessments. Although not defined in the EEA, psychological testing includes the psychometric tests conducted on applicants for employment as part of the recruitment process. It should be noted that the prohibitions on medical and psychological testing also extend to applicants for employment.[15] Testing which is prohibited in terms of the EEA would give rise to unfair discrimination, and the applicant or employee would be entitled to approach the Labour Court for an order prohibiting the employer from using such a test, alternatively for compensation.

The application of neurotechnologies to measure and assess, inter alia, emotional wellbeing, fatigue, health and cognitive performance may, depending on the test conducted, constitute medical and/or psychological testing for the purposes of the EEA.

In the event that employers intend to implement such tests in the workplace, they would, as a first step, need to determine whether the test constitutes a medical test or a psychological test. If the AI test falls within the ambit of a psychological test, the employer would need to ensure that the test meets the standards of the HPCSA, as a minimum, and complies with the balance of the requirements in section 8 of the EEA. To the extent that it fails to comply with these requirements, the test would not be permissible in terms of the EEA.

If it is established that the AI tests constitute medical tests, employers will face a far higher hurdle before being entitled to use such tests. As a point of departure, the employer would have to show that the test is permissible in terms of legislation. For example, in certain industries medical testing is permitted in terms of legislation such as, inter alia, the Occupational Health and Safety Act 85 of 1993 and the Mine Health and Safety Act 29 of 1996.

In workplaces governed by this legislation, and depending on the extent of the health and safety risks identified in a workplace-specific health and safety risk assessment, employers are permitted to conduct breathalyser tests. Although a breathalyser test is geared towards determining the presence and amount of alcohol in the bloodstream, and the psychological and cognitive impact that this will have on the employee’s competence and/or ability to perform their duties, the actual test conducted is a medical test. Parallels could be drawn between breathalyser tests and the tests conducted using AI and neurotechnologies to measure rage, anxiety, sadness, exhaustion and fatigue; consequently, these types of tests are more likely to constitute medical tests than psychological tests.

In circumstances where an employer cannot rely on legislation to justify the use of a particular neurotechnology or AI medical test, the employer could nevertheless conduct such a test where it is justifiable in light of medical facts, employment conditions, social policy or the inherent requirements of the job. However, other than the inherent requirements of a job (a criterion which has been the subject of a number of matters before the Labour Courts), the remaining justification criteria in terms of the EEA remain largely untested by the Labour Court. Consequently, an employer who relies on these largely untested justifications to validate the use of neurotechnology or AI tests will not be in a position to rely on established authorities to support the use of such tests.

Assuming the employer can justify the use of the test in terms of either section 7 or section 8 of the EEA, the employer would nevertheless also be required to obtain the employee’s or applicant’s consent. The employer would need to ensure that the test measures only what the employer requires and does not extend beyond this scope by collecting information to which the employee or applicant has not consented or for which there is no work‑related justification. The employer would also need to ensure that the data acquired is applied consistently to all employees and does not discriminate against any particular group or groups, whether in conducting the tests, interpreting the data acquired or using the data to deal with employees.

Over and above the employment law hurdles arising from the use of neurotechnology in the workplace, there is the additional challenge that the processing of this personal information invokes the Protection of Personal Information Act 4 of 2013 (POPI Act). The data being collected constitutes the personal information of the employee, and specific requirements must be complied with before the employer may process and/or store this information in line with the POPI Act.


The potential applications of neurotechnology and AI are infinite and are not limited to employers using neurological and other information to assess and discriminate against their employees and/or applicants for employment. Neurotechnology is becoming more prevalent in smart watches, earphones and wireless earbuds that can measure brain and electrical activity in the human body and gather neurological information on their users, for purposes that may or may not be disclosed to the user and in respect of which the user may or may not have provided express consent to process, in part or in whole.

Given the speed with which neurotechnology and AI are taking over the day‑to‑day activities of both personal and working life, the South African legal landscape, which has traditionally been slow to respond to technological advancements, will need to evolve with greater speed in order to reform and keep up with the social and technological changes permeating the world we live in. That said, a balance will need to be struck between protecting humans and their mental capacity, business efficiencies, neurological and cognitive advancements and the increasingly invasive impact of these machines on society.


[1] “The risks and challenges of neurotechnologies for human rights” by the United Nations Educational, Scientific and Cultural Organisation published in 2023
[2] See “What the rise of neurotechnology means for the future of healthcare”
[3] See “The Ethical Challenges of Connecting Our Brains to Computers” (accessed on 21 April 2023)
[4] The National Library of Medicine defines “mental privacy” as “the idea that we should have control over access to our neural data and to the information about our mental processes and states that can be obtained by analysing it”, see Is Mental Privacy a Component of Personal Identity? (accessed on 21 April 2023)
[5] See page 7 of “The risks and challenges of neurotechnologies for human rights” by the United Nations Educational, Scientific and Cultural Organisation published in 2023
[6] See page 8 of “The risks and challenges of neurotechnologies for human rights” by the United Nations Educational, Scientific and Cultural Organisation published in 2023
[7] See Neurotech at Work (accessed on 21 April 2023)
[8] See These Chinese workers’ brain waves are being monitored and Australian Employees Make Use of SmartCaps for Brain Monitoring in the Workplace (accessed on 21 April 2023)
[9] See These Chinese workers’ brain waves are being monitored (accessed on 21 April 2023)
[10] See ‘Forget the Facebook leak’: China is mining data directly from workers’ brains on an industrial scale (accessed on 21 April 2023)
[11] See These Chinese workers’ brain waves are being monitored (accessed on 21 April 2023)
[12] See Australian Employees Make Use of SmartCaps for Brain Monitoring in the Workplace (accessed on 21 April 2023)
[13] Section 1 of the EEA, definition of “medical testing”.
[14] Definition of “medical condition” (accessed on 8 May 2023).
[15] Section 9 of the Employment Equity Act 55 of 1998.