
If you've been on your social media accounts recently, you most likely can't have avoided the new fad: the Ten Year Challenge.

Celebrities from Mariah Carey and Trevor Noah to Amy Schumer and Caitlyn Jenner have taken part in the glow-up experiment, with most famous faces simply proving how freakishly ageless they are.

Some participants brought humour into the fray, pairing their image with that of a look-alike celebrity or, in Jenner's case, showing her gender transition over the last 10 years.

"Now THAT is a #10YearChallenge. Be authentic to yourself" – a post shared by Caitlyn Jenner (@caitlynjenner)

The concept of then-and-now images isn't exactly new, but it has gained massive traction over the last week. What harm could there be?

Kate O'Neill, writing in Wired, introduced a notion that essentially blew our minds, and even forced Facebook to deny her semi-sarcastic suggestion.

Her idea? That the 10 Year Challenge could be useful to any entity looking to develop facial recognition algorithms that deal with ageing.

O'Neill flipped a metaphorical table by suggesting the tech giant had initiated the trend solely to extract facial recognition data from the social network's users.

In her article, Facebook's '10 Year Challenge' Is Just a Harmless Meme – Right?, she writes:

"I knew the facial recognition scenario was broadly plausible and indicative of a trend that people should be aware of. It’s worth considering the depth and breadth of the personal data we share without reservations."

Allegedly, the conspiracy boils down to Facebook needing data to experiment with, and the meme providing the perfect way to gather it.

"Imagine that you wanted to train a facial recognition algorithm on age-related characteristics and, more specifically, on age progression (e.g., how people are likely to look as they get older)," she added. 

"Ideally, you'd want a broad and rigorous dataset with lots of people's pictures. It would help if you knew they were taken a fixed number of years apart—say, 10 years." WHAT.

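To see why a meme like this would be such convenient training material, here is a minimal, purely hypothetical sketch in Python. The post data, field names and files are invented for illustration and don't reflect any real Facebook system; the point is simply that every captioned then-and-now post amounts to a labelled before/after pair with a known time gap, which is exactly the kind of example an age-progression model would be trained on.

```python
import re

# Invented example posts: a "then" photo, a "now" photo and a caption.
posts = [
    {"then": "me_2009.jpg", "now": "me_2019.jpg", "caption": "2009 vs 2019 #10YearChallenge"},
    {"then": "dog_then.jpg", "now": "dog_now.jpg", "caption": "then and now"},  # no years stated
]

def labelled_pairs(posts):
    """Yield (then_photo, now_photo, year_gap) for posts whose caption names two years."""
    for post in posts:
        years = sorted(int(y) for y in re.findall(r"\b(?:19|20)\d{2}\b", post["caption"]))
        if len(years) == 2:
            yield post["then"], post["now"], years[1] - years[0]

print(list(labelled_pairs(posts)))
# [('me_2009.jpg', 'me_2019.jpg', 10)] – a before/after pair labelled with a 10-year gap
```
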
O'Neill is saying that the powerful technology company could use such an algorithm for advertising, insurance assessment, healthcare and finding missing children: applications that could be positive but are simultaneously dangerous.

Of course, this is all speculation, with no hard evidence behind it. Yet Facebook still felt compelled to dispel the rumours and deny it had anything to do with starting the trend.

Do we place too much trust in sites like Facebook? Even if the challenge isn't a case of social engineering, the website has come under fire following numerous controversial claims against it.

Examples of social games designed to extract data aren't far from reality: just cast your mind back to the Cambridge Analytica scandal.

The mass extraction of data from over 70 million American Facebook users rocked the country so much that Mark Zuckerberg himself had to appear before Congress.

Another aspect of the site that garners negative attention is its community guidelines, which seem to be applied more rigidly to certain types of people.

Let's face it, Facebook is already heavily entangled in politics, from the critical 2016 US presidential election to Russian interference.

According to Kate O'Neill, the data acquired by major tech corporations could be put to use in law enforcement and the surveillance of entire populations:

"After Amazon introduced real-time facial recognition services in late 2016, they began selling those services to law enforcement and government agencies, such as the police departments in Orlando and Washington County, Oregon."

"But the technology raises major privacy concerns; the police could use the technology not only to track people who are suspected of having committed crimes, but also people who are not committing crimes, such as protesters and others whom the police deem a nuisance," she continued.

Facebook's involvement in various privacy controversies has created a tumultuous relationship between the tech giant and its users.

O'Neill is definitely right about one thing: data is one of the most powerful currencies, so don't spend it dangerously.

“Regardless of the origin or intent behind this meme, we must all become savvier about the data we create and share, the access we grant to it, and the implications for its use."

Feature image credit: Mamamia


Hashtags which might be promoting eating disorders on Instagram have now been placed on an 'unsearchables' list following an investigation.

It was discovered that users of the photo-sharing network were bypassing the platform's filters, and health warnings have since been added to several spellings or terms which reference eating disorders.

Many of these terms are popular hashtags on the platform, but if they are on the 'unsearchables' list, no results will come up.

In 2012, the site began making some terms unsearchable in an effort to prevent users from finding often upsetting graphic images and posts that encouraged the idea that eating disorders are part of a lifestyle rather than a mental illness.

However, BBC Trending claims that certain terms are still searchable, including ones which promote bulimia, and that Instagram's search bar suggests alternative terminology and spellings for terms glamorising eating disorders.

In one such instance, the search box offered a shocking 38 alternative spellings for a popular term promoting the disorders.

Instagram has now made several of these alternative terms unsearchable and has added many to the list of terms that trigger the health warning. It has also said it will continue trying to restrict such content.

A spokesperson for the company commented:

"We do not tolerate content that encourages eating disorders and we use powerful tools and technologies – including in-app reporting and machine learning – to help identify and remove it," 

"However, we recognise this is a complex issue and we want people struggling with their mental health to be able to access support on Instagram when and where they need it."

"We, therefore, go beyond simply removing content and hashtags and take a holistic approach by offering people looking at or posting certain content the option to access tips and support, talk to a friend, or reach out directly" to support groups.

Social networks have begun to censor content which could encourage eating disorders, yet many people online have discovered a way to get around the filters by using deliberately misspelled hashtags.

Instagram, like most popular sites, doesn't use moderators to proactively search for dangerous content, and relies on users to report violations of its rules.

Algorithms often fail to tell the difference between supportive and harmful content, and can end up serving ads and suggested pages that promote the disorders.

Eating disorder charities are demanding that social media networks take more responsibility for policing their content.

Certain sites and Instagram pages offer support to people recovering from eating disorders, and there is an argument that removing posts could shut down important discussions surrounding the illness.

The rules of Instagram prohibit posts which promote or glorify eating disorders, but the company has a long way to go to develop its safety policies.


Sex dolls are creepy by nature, but the ones created by Orient Industry are so realistic-looking that they'd send chills down your spine.

The company takes custom orders, meaning buyers can choose bust size, hair colour, eyes and everything else. And the silicone it uses apparently feels like actual skin.

Apparently the dolls (called ‘Dutch Wives’) cost more than £1,000 each, and they are a BIG hit.

A spokesperson from Orient Industry said: “The two areas we identified as really needing improvement were the skin and the eyes.

“We feel we have finally got something that is arguably not distinguishable from the real thing.”

Hmmm… we're not sure how good a thing it is for sex dolls to look like actual women. Could they be spreading the message that women are just sexual playthings – even more than the ridiculous-looking sex dolls already do?

 


According to new research, eating a diet high in protein could be as dangerous to your health as smoking 20 a day.

Researchers at the University of Southern California, who conducted the study, defined a high-protein diet as one in which protein makes up at least 20% of a person's daily calories. The results showed that those under the age of 65 who consumed this amount of animal protein had an increased risk of developing cancer.

Study author Dr Valter Longo, professor of biogerontology at the USC Davis School of Gerontology, said: “High levels of protein can be as bad for you as smoking. People should understand the distinction and be able to make the decision about what they eat.”

Red meat, milk and cheese are reportedly the most harmful, but fortunately there was no evidence to suggest fish protein has any harmful effect.

Dr Longo added:  “Some proteins are better for you than others, for example plant-based proteins like beans. Vegans seem to do better in studies than those who eat animal based proteins. Red meat always comes out top as the worst and that’s probably due to its other components.”

Unfortunately, the results will cast a shadow over the benefits of high-protein diets like the Paleo diet.


A new study has shown that people who live their lives at the speed of a tortoise have a higher chance of an early grave.

Looking at the reaction times of 5,000 men and women, researchers found that slower people are 25% more likely to die in the next 25 years – regardless of their age.

It’s so worrying that experts are now saying the lack of alertness could be as dangerous as smoking.

They also think that a slow reaction could indicate something more serious in the body, since the speed of your brain can act as a mirror of what’s happening inside you – or not happening.

The results of the study showed that slow reactions can even be linked to heart attacks, strokes and car accidents.

And we thought being chilled out was a good thing…
