One of the main challenges with autonomous weapon systems lies in the difficulty of anticipating their effects. From a humanitarian perspective, they risk harming civilians and increasing the risk of conflict escalation. From a legal perspective, they challenge the ability to comply with legal obligations when planning attacks. From an ethical perspective, their process of functioning risks effectively substituting sensor, software and machine processes for human decisions about life and death. (Source: ICRC, Autonomous weapons: The ICRC recommends adopting new rules, August 2021.) In her presentation, Julie Tenenbaum, a senior legal adviser at the International Committee of the Red Cross, will address these issues, discussing concrete examples and reflecting on the legal framework that can limit these risks.
Julie Tenenbaum
Julie Tenenbaum works for the International Committee of the Red Cross as the Regional Legal Adviser for Europe based in Paris. She qualified as a solicitor in England and worked in the legal field in London, with an emphasis on immigration and refugee law. She then joined the ICRC and worked as a protection delegate in Sri Lanka for a year and as Regional Legal Adviser for Southern and then Western Africa for almost four years. In that capacity, she advised national authorities on International Humanitarian Law (IHL) and on its national implementation. At the end of 2014, she took up the Regional Legal Adviser position for Europe and, since then, has been working on specific IHL issues as well as other legal issues of interest to the ICRC and relevant to the region.