Microblogging and social networking website Tumblr, which faced a years-long struggle for approval on the iOS App Store, has said that it has made fresh changes in order to remain on the App Store.


In 2018, Tumblr’s iOS app was taken down from the App Store under the child sexual abuse material (CSAM) policy.





A month later, the platform reacted by banning all porn and other sexually explicit content, resulting in a 29 per cent decrease in monthly traffic.


Since then, the platform’s web traffic has remained relatively stagnant, reports The Verge.


“In order for us to remain in Apple’s App Store and for our Tumblr iOS app to be available, we needed to make changes that would help us be more compliant with their policies around sensitive content,” Tumblr said in a recent blog post.


Many Tumblr users come to the platform to talk anonymously about their experiences.


The platform said that “for those of you who access Tumblr through our iOS app, we wanted to share that starting today you may see some differences for search terms and recommended content that can contain specific types of sensitive content”.


“In order to comply with Apple’s App Store Guidelines, we are having to adjust, in the near term, what you’re able to access as it relates to potentially sensitive content while using the iOS app,” said the platform.


To remain available within Apple’s App Store, the company had to expand its definition of sensitive content, as well as change the way its users access such content, in order to comply with the guidelines.


“We understand that, for some of you, these changes may be very frustrating – we understand that frustration and we are sorry for any disruption that these changes may cause,” said Tumblr.


Apple’s CSAM feature is intended to protect children from predators who use communication tools to recruit and exploit them.


It is part of a set of features that includes scanning users’ iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when they receive or send sexually explicit photos, and expanded CSAM guidance in Siri and Search.


–IANS



(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)






