Post by account_disabled on Feb 15, 2024 4:01:54 GMT -6
But three key areas of the bill could be improved.

1. Loose definition of "artificial intelligence system"

According to the draft, an "artificial intelligence system" will be defined in law as "a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments."

The proposed definition of "artificial intelligence system" seems loose. The phrases "machine-based system" and "influence physical or virtual environments" arguably apply to all machines and software, not just machine learning (which is what we really mean when we say "artificial intelligence"). The only key word that distinguishes ordinary software from artificial intelligence is "infers," but it is not clear what that means. This ambiguity becomes tricky when dealing with software that does not use machine learning methods but still produces high-stakes decisions or outputs.

For example, from 2016 to 2019 the Australian government operated an automated system called Robodebt, designed to calculate taxpayers' outstanding debts by comparing data from different income sources. Technically it did not use any machine learning, yet its miscalculations resulted in more than 500,000 erroneously issued debt notices. Robodebt's mistakes caused unnecessary stress for many taxpayers and were even linked to some tragic suicides. Yet under the bill's wording, Robodebt arguably does not "infer, from the input it receives, how to generate outputs," as the definition of an AI system requires. So, would a system like this fall under the EU Artificial Intelligence Act? It should, but the vague language may let it fly under the radar.

2. Drawbacks of the GPAI model rules

The problem with GPAI models is that the same model can be used for safe or dangerous applications.