Sulli Deals and Bulli Bai | How much responsibility should GitHub take? – India Times English News

Panic, anger, disbelief and resignation. Then, just constant anger.

A commercial pilot, Hana Mohsin Khan, first felt it after learning she had been “auctioned” in July 2021 on the Sulli Deals app hosted on US-based software collaboration and hosting platform GitHub. “I used to be a really happy person. It all happened because I am a Muslim and a thinking woman,” she told Moneycontrol.

After six months of agony, just as she was ready to leave it all behind and start afresh in 2022, another app, Bulli Bai, surfaced, and on January 1, more than 100 women were put up for “auction”. “I was not a part of Bulli Bai, but it brings back memories and the feeling that it will never stop. There is no progress, no hope,” she said.

What really bothers her, and many other women and digital rights activists, is the total lack of accountability from GitHub, where the two apps were hosted. “Platforms like GitHub need to take responsibility. We need more security and scrutiny, and now is the time to do something about it.”

The recent incident has highlighted the lack of transparency around moderation on Microsoft-owned GitHub, and how such issues are addressed.

GitHub

Founded in 2008 in the US, GitHub has been dubbed a social network for developers. Millions of developers and hundreds of organizations use the platform to host software projects.

There are about 73 million developers on GitHub, of whom 16 million joined in 2021 alone. The platform has around 5.8 million users in India, which is one of its fastest growing markets.

Hundreds of developers upload applications every day, editing and collaborating with peers around the world. As a Bengaluru-based security researcher points out, most of the content on the platform is code in Java and other programming languages, as opposed to pictures or text. The source code behind many of these applications is public, and this makes the platform vulnerable to moderation issues not just in India but globally.

For example, a web developer, Sami, detailed on Twitter how the Bulli Bai app used the same source code as Sulli Deals. “The one who created the GitHub page ‘SullyDeals’ is the same person who has now created the GitHub page ‘Bullibai’. It looks like they’ve rewritten the text on the page, but it’s the same code; there is even a function in the code named ‘Sulli’.”

Raj Pagaria, a technology lawyer and partner at The Cyber Blog India, says the bigger issue is the platform’s efforts to ensure that incidents that violate its community guidelines are not repeated, as in the case of Sulli Deals and Bulli Bai.

“When it happens once, it is understandable. But when it happens twice, there is a problem. The fact is that the platform did not try hard enough to ensure that there was no recurrence,” he said.

Not Once, Not Twice

In July 2021, around 80 women were auctioned off on Sulli Deals, hosted on GitHub.

In the following days, FIRs were registered in Uttar Pradesh and New Delhi, but so far there has been no progress in the investigation. Khan, who filed her complaint in Noida, said that despite repeated follow-ups no action was taken, and many victims gave up.

Getting support from GitHub in criminal proceedings is also a challenge.

Anoushka Jain, Associate Counsel (Surveillance and Transparency) at the Internet Freedom Foundation, explained that any information sought from GitHub for criminal proceedings has to be obtained through the MLAT agreement between India and the US. A Mutual Legal Assistance Treaty (MLAT) is an agreement between two or more countries for gathering and exchanging information in an effort to enforce public or criminal laws.

Just as things were calming down, Bulli Bai, another app hosted on GitHub, surfaced on January 1, “auctioning” photos of more than 100 women. The app was taken down immediately.

However, unlike last time, the response was swift. Police acted quickly, with politicians stepping in, and an FIR was registered in Mumbai. Four people have been arrested so far, all of them students: an engineering student from Bengaluru, two from Uttarakhand and one from Assam.

While prompt action by law enforcement has been encouraging, many pointed out that it does not address the core issue: how GitHub handles such content.

A Question of Scale

“If you look at platforms like GitHub, they are quite large in size and for any large platform, content moderation is a struggle,” Pagaria said.

“But at that size, one would expect the platform to do something to prevent Sulli Deals from happening a second time. That didn’t happen; just six months later, it came up again,” he said.

Padmini Ray Murray, founder of tech and design collective Design Beku, told Moneycontrol that when two apps use the same code, the platform should have a check in place so that the second app cannot go live. “But they haven’t done anything, or if they did, we don’t know what it is. More transparency is needed,” she said.

Akanksha S Srivastava, founder of Akanksha Against Harassment, which works with law enforcement on cyberbullying, said: “For the platform, just blocking them is not enough. They have a responsibility and they should take preventive action. Also, the response from GitHub needs to be better.”

According to experts, the time has come for platforms to have better moderation tools, and be more transparent about how they address these concerns.

Content Moderation

Content moderation in general is a slippery slope. But unlike social media platforms, which are required to comply with Indian rules and regulations, GitHub falls into a thornier area.

Under the new IT rules, all significant social media intermediaries must appoint a grievance officer, a chief compliance officer and a nodal contact person in India. From Facebook to local social media apps, they all now have local officers who cooperate with the government and whom users can approach with complaints.

But GitHub has no such officers in India, and victims have to take the legal route to get information from the platform.

Recently, law student Amar Banka sent a legal notice to GitHub on the issue and posted the response he received on Twitter: “Foreign law enforcement officials wishing to request information from GitHub should contact the United States Department of Justice Criminal Division’s Office of International Affairs. GitHub will promptly respond to requests that are issued via US court by way of a Mutual Legal Assistance Treaty (MLAT) or letter rogatory.”

A digital rights activist, who spoke on condition of anonymity, said the MLAT process is just for show and does not work in most cases.

Moneycontrol sent GitHub detailed questions on moderation, compliance with local laws, its grievance mechanism, and the level of cooperation it extends to authorities and the government in India.

GitHub did not respond to the specific questions, but shared a statement: “GitHub has longstanding policies against content and conduct involving harassment, discrimination, and inciting violence. We suspended a user account following the investigation of reports of such activity, all of which violate our policies.”

GitHub’s policies prohibit unlawful, defamatory and abusive content targeting any individual or group, as was the case with Sulli Deals and Bulli Bai. The platform removes content when it violates community guidelines and is reported.

This raises the question of whether moderating code is the way to go, given that this isn’t the first time the company has had trouble with moderation.

History of Moderation Issues

In 2014, India blocked 32 sites, including GitHub, for hosting ISIS-related content. Globally, the platform has come under scrutiny for hosting code that allows people to create deepfakes, which can be used to produce non-consensual pornographic videos. The platform has also been censored in other countries, including China and Russia.

While victims have called for better moderation, this raises the question of how far a platform should go to moderate content.

The security researcher cited earlier, who works at a Bengaluru-based unicorn, explained that actively moderating code hosted by millions of developers would be a challenge, since most of the content on the platform is code rather than pictures or text. “To moderate it, one has to go through each project to find out whether it violates policies. When there are hundreds of thousands of repositories, it is not possible to moderate each one,” the researcher said.
