Facebook Faces a $2 Trillion Fine!

A piece of news has been roiling the tech world over the past two days, and by now most people have heard it: Facebook has been hit by a data-leak scandal, with data on 50 million users exposed.


Even more alarming, Facebook's share price plunged within just two days, wiping roughly $50 billion off its market value in a record decline. Some media outlets went so far as to say Facebook is on the edge of life and death.


On March 19, local time, Facebook's stock fell sharply as soon as the market opened.

More astonishing still, this talk of "life and death" is not media hyperbole. If the allegations are confirmed, Facebook could, under the applicable rules, face a fine of up to $2 trillion, nearly four times its current market capitalization. Rich as Facebook is, it could not pay out that much money. This would be a genuine existential catastrophe.

But something about this story seems odd. Data leaks, serious as they are, are hardly a first for ordinary users. In the internet era, who hasn't had their information sold eight or ten times over? Why is this one such a big deal? A closer look shows that its severity really is in a different league from earlier leaks; it has almost the feel of a Facebook-scale Snowden affair.


In short, this user-data leak is believed to be linked to manipulation of the presidential election in Trump's favor. Say user A likes browsing gun-related content on Facebook and regularly likes the latest posts. Each "like" is captured by the back end, which concludes that this person supports gun ownership. Within minutes, stories such as "Hillary opposes gun rights" and "Trump supports gun rights" start appearing on A's pages. Or say user B keeps browsing election-related pages, following Hillary Clinton one moment and Trump the next, occasionally watching their speeches. The system quickly infers that B is still undecided, and floods B with stories praising Trump and attacking Clinton. The pro-Trump news you thought you had stumbled on "by chance," which may ultimately have swayed you to vote for Trump, was not chance at all: it was deliberately placed. No wonder the American public feels so deeply deceived.

The company behind this "precision targeting" service was a third-party data-analytics firm operating on Facebook: Cambridge Analytica. The firm even boasted privately that it was the real force behind Trump's victory. But the truth could not stay hidden: a few days ago one of its co-founders, Christopher Wylie, perhaps unable to bear the weight of his conscience, blew the whistle. The app Wylie worked on ran as a third-party program on Facebook. At the time, hoping to diversify its platform, Facebook had invited third-party companies to build small games and quizzes, and Cambridge Analytica seized the opportunity to launch a personality-test app, billed as "an app psychologists use for research." That was back in 2014, when public awareness of data protection was still weak, and users flocked to take a test supposedly built for psychologists. In this way 270,000 Facebook users eagerly handed over their names, ages, hometowns, interests, hobbies, and leisure activities to the app.


With that, the quiz app obtained first-hand user data without the slightest effort. But a sample confined to those 270,000 people was too small to make real waves, however ambitious the plan, and Cambridge Analytica was never going to settle for it. So they soon devised a new tactic: they ran an ad on Facebook claiming that, together with Cambridge University psychology professor Aleksandr Kogan, they had developed a piece of software, and that anyone who completed the test would receive $5 in their account. When users went to download it, they discovered the $5 did not come free: to install the app, you needed at least 185 friends. Eager to collect the money quickly, would-be users started adding friends en masse. Do the math, and this snowball method rapidly expanded the original sample of 270,000 into 50 million. If you were a friend of any of those 270,000 people, your posts, likes, and other activity on Facebook could be quietly harvested by Cambridge Analytica. In other words, whether your information was collected was decided not by you, but by your friends.

Cambridge Analytica then moved to the next step. They sent out questionnaires with a quietly inserted option at the end asking users to let the software view their Facebook profiles. Once a user clicked "agree," the firm went to work unnoticed, using Facebook's real-name user base to push content favorable to Trump's election campaign.

Seeing their information leaked like this, many users were furious and directed their anger at Facebook, accusing the company of standing idly by. And this time the blame is not unfair: according to Wylie, Facebook knew about it all along. As early as 2014, the company had detected Cambridge Analytica's abnormal, large-scale harvesting of user data.

On the morning of March 22, Beijing time, Facebook CEO Mark Zuckerberg posted a statement: "We made mistakes, and there is more to do. We are working out exactly what happened; I am responsible for what happens at Facebook, and we will investigate all apps that had access to the data."

Zuckerberg's full post follows:

I want to share an update on the Cambridge Analytica situation -- including the steps we’ve already taken and our next steps to address this important issue.

We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you. I’ve been working to understand exactly what happened and how to make sure this doesn’t happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there’s more to do, and we need to step up and do it.

Here’s a timeline of the events:

In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends’ birthdays, your maps should show where your friends live, and your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them.

In 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by around 300,000 people who shared their data as well as some of their friends’ data. Given the way our platform worked at the time this meant Kogan was able to access tens of millions of their friends’ data.

In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access. Most importantly, apps like Kogan’s could no longer ask for data about a person’s friends unless their friends had also authorized the app. We also required developers to get approval from us before they could request any sensitive data from people. These actions would prevent any app like Kogan’s from being able to access so much data today.

In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica. It is against our policies for developers to share data without people’s consent, so we immediately banned Kogan’s app from our platform, and demanded that Kogan and Cambridge Analytica formally certify that they had deleted all improperly acquired data. They provided these certifications.

Last week, we learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica may not have deleted the data as they had certified. We immediately banned them from using any of our services. Cambridge Analytica claims they have already deleted the data and has agreed to a forensic audit by a firm we hired to confirm this. We’re also working with regulators as they investigate what happened.

This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.

In this case, we already took the most important steps a few years ago in 2014 to prevent bad actors from accessing people’s information in this way. But there’s more we need to do and I’ll outline those steps here:

First, we will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity. We will ban any developer from our platform that does not agree to a thorough audit. And if we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps. That includes people whose data Kogan misused here as well.

Second, we will restrict developers’ data access even further to prevent other kinds of abuse. For example, we will remove developers’ access to your data if you haven’t used their app in 3 months. We will reduce the data you give an app when you sign in -- to only your name, profile photo, and email address. We’ll require developers to not only get approval but also sign a contract in order to ask anyone for access to their posts or other private data. And we’ll have more changes to share in the next few days.

Third, we want to make sure you understand which apps you’ve allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you’ve used and an easy way to revoke those apps’ permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.

Beyond the steps we had already taken in 2014, I believe these are the next steps we must take to continue to secure our platform.

I started Facebook, and at the end of the day I’m responsible for what happens on our platform. I’m serious about doing what it takes to protect our community. While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn’t change what happened in the past. We will learn from this experience to secure our platform further and make our community safer for everyone going forward.

I want to thank all of you who continue to believe in our mission and work to build this community together. I know it takes longer to fix all these issues than we’d like, but I promise you we’ll work through this and build a better service over the long term.

