The British government announced on Tuesday that creating and sharing sexually explicit “deepfakes” will soon become a criminal offence in the country, an attempt to tackle a surge in the proliferation of such images, which mainly target women and girls.
Deepfakes are videos, pictures or audio clips made with artificial intelligence (AI) to look real, and the technology can be used to digitally alter pornographic images into the likeness of someone else.
Publishing sexual photos or videos without consent and with intent to cause distress, commonly known as “revenge porn”, was criminalised in Britain in 2015, but the legislation does not cover the use of fake images, which have surged since AI tools became widely available.
Data from the UK-based Revenge Porn Helpline shows that image-based abuse involving deepfakes has increased by more than 400% since 2017.
Under the new offence, perpetrators could face prosecution for both generating and sharing these images.
The previous Conservative government, which lost power to the Labour Party in July, had announced similar plans to make sexually explicit deepfakes a criminal offence. Under its proposal, offenders would face fines and even jail sentences.
Further details of the new offence will be set out in due course, according to the justice ministry.
The government also said it would create new offences for taking intimate images without consent and for installing equipment with intent to commit these offences. Those found guilty could face up to two years in prison.
Britain’s technology minister, Margaret Jones, said tech platforms hosting abusive images would face tougher scrutiny and significant penalties.
These new offences will be included in the government’s Crime and Policing Bill, which will be introduced to parliament, though a date has yet to be set.