We are no longer dealing with hypotheticals or science fiction: artificial intelligence is being utilised as a weapon of war to kill people, with only cursory, apathetic oversight.
This week, Yuval Abraham, an Israeli journalist writing for +972 Magazine in collaboration with Local Call, published a damning article that highlights the brutality and callousness of the IDF’s war against the Palestinian people.
Here are the
highlights:
· The artificial intelligence (AI) system known as Lavender is being used to automatically generate kill lists for the Israeli military.
· The AI system marks all people suspected of being Palestinian militants and generates a list of bombing targets within Gaza.
· This system was adopted during the early stages of the war, and 37,000 Palestinians were marked for death, with no process in place to double-check the machine’s conclusions.
· This system was adopted despite acknowledgments that at least 10% of the people placed on these target lists were placed there in error by the machine.
· This AI system was used in conjunction with other systems, such as Where’s Daddy?, which was designed to signal to the IDF when the targeted individuals entered their family residences.
· Due to the expense of guided missiles, larger and far more destructive unguided dumb bombs are extensively used, with the result that entire buildings and their inhabitants have been marked for death because a machine determined that an enemy operative was in the building.
· The IDF has internally permitted up to 20 civilian deaths as collateral for each enemy combatant killed in these strikes, and for enemy commanders, up to 100 civilian deaths are permitted.
Here is a
list of statements from the IDF soldiers the journalist interviewed:
At 5 a.m., [the air force] would come and bomb all the houses
that we had marked… We took out thousands of people. We didn’t go through them
one by one — we put everything into automated systems, and as soon as one of
[the marked individuals] was at home, he immediately became a target. We bombed
him and his house…
It was very surprising for me that we were asked to bomb a
house to kill a ground soldier, whose importance in the fighting was so low… I
nicknamed those targets ‘garbage targets.’ Still, I found them more ethical
than the targets that we bombed just for ‘deterrence’ — highrises that are
evacuated and toppled just to cause destruction.
A human being had to [verify the target] for just a few
seconds… At first, we did checks to ensure that the machine didn’t get
confused. But at some point we relied on the automatic system, and we only
checked that [the target] was a man — that was enough. It doesn’t take a long
time to tell if someone has a male or a female voice.
…if the [Hamas] target gave [his phone] to his son, his older
brother, or just a random man. That person will be bombed in his house with his
family. This happened often. These were most of the mistakes caused by
Lavender.
Let’s say you calculate [that there is one] Hamas [operative]
plus 10 [civilians in the house]… Usually, these 10 will be women and
children. So absurdly, it turns out that most of the people you killed were
women and children.
It was like that with all the junior targets… The only
question was, is it possible to attack the building in terms of collateral
damage? Because we usually carried out the attacks with dumb bombs, and that
meant literally destroying the whole house on top of its occupants. But even if
an attack is averted, you don’t care — you immediately move on to the next
target. Because of the system, the targets never end. You have another 36,000
waiting.
At first we attacked almost without considering collateral
damage… In practice, you didn’t really count people [in each house that is
bombed], because you couldn’t really tell if they’re at home or not. After a
week, restrictions on collateral damage began. The number dropped [from 15] to
five, which made it really difficult for us to attack, because if the whole
family was home, we couldn’t bomb it. Then they raised the number again.
In the bombing of the commander of the Shuja’iya Battalion,
we knew that we would kill over 100 civilians… For me, psychologically, it was
unusual. Over 100 civilians — it crosses some red line.
“Sometimes [the target] was at home earlier, and then at night
he went to sleep somewhere else, say underground, and you didn’t know about
it,” one of the sources said. “There are times when you double-check the
location, and there are times when you just say, ‘Okay, he was in the house in
the last few hours, so you can just bomb.’”
They are using this new technology in Gaza because the Palestinians can be used as guinea pigs by the IDF and the American military-industrial complex. This technology is a gross violation of the rules of engagement and helps explain why more than 33,000 Gazans have been killed in this war so far, a death toll that ranks among the highest of any 21st-century conflict to date.
Soon, AI will be utilised by military-industrial complexes around the world to generate these assassination and bombing lists. War truly lacks humanity.