How Could Facebook Do a Better Job at Controlling Disinformation?


Hello, everybody. Great discussion this morning about Facebook and what is going on with their monitoring and controlling of some topics. Should they have something in place that really stops false information? How could they do that? And what's their real motivation behind all of this?

With Mr. Christopher Ryan, we also got into how the General Services Administration has completely messed up its authorization again, this FedRAMP authorization. Why are our federal agencies using tools like Zoom that have been proven to be very insecure, and using them with the blessing of the GSA, which says they're secure?

[00:00:50] So here we go.

[00:00:52] Chris Ryan: Craig Peterson is the host of Tech Talk on News Radio 610, 96.7. You can check him out Saturdays and Sundays at 11:30. Craig, how are you?

Craig Peterson: Hey, good morning. Doing great. Appreciate being with us.

Chris Ryan: So one of the topics we've delved into today is the social media giants and censorship of quote-unquote misinformation. The White House weighed in on this last week and criticized Facebook as a purveyor of misinformation, based upon allowing individuals a platform to push forward their opinions.

[00:01:27] From a media standpoint, we have gatekeepers available and able to squelch misinformation and provide an environment with the most accurate information possible. Is there any way, shape, or form that a Facebook or a Twitter can have any sort of algorithm, or staffing, or personnel? This is not even about whether they should do it or shouldn't do it.

[00:01:55] Is it in any way feasible or possible for them to strike down misinformation?

[00:02:03] Craig Peterson: Yeah. This is a tough one. It's easy enough here: you've got, of course, Justin sitting there with the big button, and he can cut us off at any point. But when it comes to Facebook or Twitter or any of these other big places, you have, what is it now?

[00:02:20] Billions of active users per month, just on Facebook. So what you were asking about: is there some algorithm? Obviously, people can't do this manually. And the answer is, yeah, I have a fascinating article; I'll be talking about it next Saturday. How about IBM's Watson? You probably remember Watson pretty well.

[00:02:39] It was going to rule the world. It won Jeopardy. It was absolutely amazing. And it really hasn't done much since then. It's one of the most advanced AIs we've had, and Microsoft and Google both have them; China's working on artificial intelligence to try and figure it out. But the biggest problem that we have is:

[00:02:59] what is the sentiment there? How can you tell if an article is criticizing something legitimately, or if it is endorsing something that might be a bad idea? And then, who gets to choose what's a good idea and what's a bad idea? This isn't easy to do.
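The difficulty Craig describes can be illustrated with a deliberately naive sketch. The word lists and scoring below are invented for illustration, not any platform's real method; the point is that keyword-level sentiment cannot distinguish a debunking from an endorsement.

```python
# A toy keyword-based sentiment scorer. Word lists are illustrative
# assumptions, not a real moderation system.
POSITIVE = {"great", "good", "amazing", "safe"}
NEGATIVE = {"bad", "false", "dangerous", "scam"}

def naive_sentiment(text: str) -> int:
    score = 0
    for word in text.lower().split():
        word = word.strip(".,!?")
        if word in POSITIVE:
            score += 1
        elif word in NEGATIVE:
            score -= 1
    return score

# Both sentences mention "dangerous", but one warns against a claim and
# the other endorses it -- the keyword scorer cannot tell them apart.
print(naive_sentiment("This cure is dangerous, do not take it"))   # -1
print(naive_sentiment("Experts falsely call this cure dangerous")) # -1
```

Both inputs get the same score, even though they argue opposite positions; telling them apart requires stance detection, which is exactly the hard part.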

[00:03:16] Chris Ryan: And there are no rules either. As I've said before, I don't think it was appropriate for social media to ban Trump.

[00:03:24] I don't. I think that many individuals push forward misinformation, lack truth, and their intent is not good. And they're still allowed to do whatever they wish. So I think that you have to create a platform and an infrastructure to determine what the rules should be.

[00:03:45] And how many times do you get to push things forward that aren't true, et cetera? There are no rules.

[00:03:51] Craig Peterson: Yeah. I look at this as a profit motive on the part of Facebook, and certainly these others. They want you to watch the Facebook channel. They want you on that feed all day long.

[00:04:05] So the more that you are interested in a topic or position, maybe a position that is totally false, maybe you're a flat-earther, the more information they're going to feed you about how the earth is flat, and less about anything else. So what we've ended up with now, with these social media sites, is something that really polarizes us even more than we were polarized before, because we see more and more information that confirms our suspicions and where we're at.

[00:04:36] So I see that as an even bigger problem, because we're, frankly, getting isolated. We don't have the editorial in the newspaper that might be left, might be right, might be center. As time goes on, it's showing you what you want to see. And that's a big problem.

[00:04:56] Chris Ryan: Right. And the Facebook feed has also changed over the years, where initially it was just your friends, basically, and their information. More and more, the feed has gotten filled with

[00:05:07] topics that they have determined you're interested in, and sources, which may or may not be accurate, that present that information. And this has happened to me, and I'm sure it happens to everybody else. You may click on one of those articles and be interested in it.

[00:05:22] And then the next thing you know, 50% of your feed is similar articles, or ads associated with things that you have looked up. So the Facebook feed has changed dramatically over the years, from the point where initially there weren't many ads, and you wondered how they made their money, to the point now where there are a lot of ads.

[00:05:43] And they're also continually feeding you information on topics that you're interested in.
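The feedback loop the hosts describe can be sketched as a toy ranker that boosts whatever topics you have clicked before. The topics, data structure, and update rule here are invented for illustration; real feed-ranking systems are far more complex.

```python
# Toy engagement-driven feed ranker illustrating the filter-bubble
# feedback loop: one click reshapes the whole feed.
from collections import Counter

posts = [
    {"id": 1, "topic": "flat-earth"},
    {"id": 2, "topic": "cooking"},
    {"id": 3, "topic": "flat-earth"},
    {"id": 4, "topic": "sports"},
]

interest = Counter()  # clicks per topic so far

def rank_feed(posts, interest):
    # Posts on topics you've clicked before float to the top.
    return sorted(posts, key=lambda p: interest[p["topic"]], reverse=True)

def click(post, interest):
    interest[post["topic"]] += 1  # every click reinforces that topic

# One click on a flat-earth post...
click(posts[0], interest)
feed = rank_feed(posts, interest)
# ...and now both flat-earth posts lead the feed.
print([p["topic"] for p in feed])  # ['flat-earth', 'flat-earth', 'cooking', 'sports']
```

Nothing here judges whether the topic is true or false; engagement alone drives the ranking, which is the polarization mechanism Craig points to.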

[00:05:49] Craig Peterson: Yeah. And topics that, again, might be correct, might be false. And what Facebook

[00:05:54] Chris Ryan: is doing is, in fact, giving you information that may not be accurate. So it's not just your friends; Facebook itself is providing you with information that may be false.

[00:06:04] Craig Peterson: Remember the days on Facebook, again earlier on, where if you followed someone, if you liked someone, you would see their posts; they would show up in your stream. Now Facebook realizes, oh my goodness, you've got hundreds of friends out there, they're all posting stuff, and Facebook decides what you want to see.

[00:06:26] Many of the celebrities dropped off of Facebook, because the people that wanted to see their posts were no longer seeing them, because Facebook was moderating it. So we've come full circle in our discussion: Facebook is doing moderation, and the moderation they're doing is, again, isolating us and dividing us even more.

[00:06:46] That's a huge problem. What's the solution in this day and age? I'm not exactly sure, because of the algorithms. They could be doing more moderation. Should they be doing it? And, of course, that's the whole discussion around Section 230.

[00:07:02] Chris Ryan: Exactly. So let's talk a little bit about Zoom. An article has come out about Zoom and its cybersecurity issues, and what the GSA has blocked.

[00:07:12] Senator Ron Wyden has been trying to review the documents used to approve Zoom for government use, in government meetings. And this is not just at the federal level; it happens all the time at the local level as well. There were initial concerns about cybersecurity and Zoom, but a lot of those concerns, and the conversation about them, seemed to go away during the course of the pandemic. How safe is Zoom? Obviously, you have

[00:07:39] individuals, whether from a non-profit, private-sector, or government perspective, who are discussing sensitive things on Zoom. And they assume that no one is listening, but are they?

[00:07:52] Craig Peterson: Yeah, that's a terrible assumption, particularly when it comes to Zoom. The big concern here, in this particular article that appeared on TechCrunch, has to do with the investigation being done by the feds.

[00:08:06] So we know there's obviously information that needs to be kept secret, other classified information, and then there's information that might be damaging. And what's happened here is the General Services Administration gave Zoom its authorization, this FedRAMP authorization, saying, yeah, go ahead and use it.

[00:08:25] It's going to be fine, everything's great. It turned out Zoom was not encrypting these sessions from end to end. In fact, it was routing some of our conversations live through Chinese servers. They're using Chinese programmers to write this stuff. They installed back doors on Macs. Just all kinds of incredible, terrible stuff that Zoom had been doing.

[00:08:50] And some of it, allegedly, it is continuing to do. And yet, somehow, it received this authorization from the GSA. How did that happen? Now, there are some secure ways to speak online. Really, WebEx is the only one that is fully authorized, and then you have to have the right version of it. Microsoft Teams has some stuff that is also authorized and

[00:09:19] truly end to end. Zoom is not to be trusted. But what really concerns me is that it looks like the federal government's process for delving into any of these tools, to verify whether or not they appear to be safe, was terribly flawed. And that's exactly what the Senator has been trying to find out, the why and how, and they're just not providing the information.
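The distinction Craig draws, end-to-end encryption versus encryption that a server can undo mid-route, can be sketched in a toy model. The XOR "cipher" below is a stand-in, not real cryptography; the point is only where the plaintext becomes visible.

```python
# Toy contrast between hop-by-hop (transport) encryption and true
# end-to-end encryption. XOR is its own inverse: applying the same
# key twice restores the original bytes.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def relay_hop_by_hop(msg, key_client_server, key_server_peer):
    # Transport encryption: the server decrypts in the middle, so it
    # (or anyone with access to it) can read or log the plaintext.
    ciphertext_in = xor_cipher(msg, key_client_server)
    plaintext_at_server = xor_cipher(ciphertext_in, key_client_server)
    return plaintext_at_server, xor_cipher(plaintext_at_server, key_server_peer)

def relay_end_to_end(msg, key_endpoints):
    # End-to-end: only the two endpoints hold the key; the server
    # forwards an opaque blob it cannot decrypt.
    return xor_cipher(msg, key_endpoints)

secret = b"meeting agenda"
seen_by_server, _ = relay_hop_by_hop(secret, b"k1", b"k2")
print(seen_by_server == secret)  # True: the server saw the plaintext

blob = relay_end_to_end(secret, b"e2e-key")
print(blob == secret)            # False: the server only sees ciphertext
print(xor_cipher(blob, b"e2e-key") == secret)  # True: the far endpoint recovers it
```

In the hop-by-hop model the relay is a trusted party by construction, which is exactly why claims of "encrypted" calls that are not end-to-end matter for government use.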

[00:09:43] Chris Ryan: I have a basic theory that applies, I think, to tech and social media: if something is free, ask why. Figure out: why is this free? Why are you able to do it for free? What is in it for the company? And then decide whether or not you want to use it. But I think that people are all in on giving their information and using

[00:10:06] these things, and they don't really understand what the business platform is or how it is being used. And I think that is something individuals should be cognizant of. Craig, as always, thank you so much. Take care. That was Craig Peterson. He is the host of Tech Talk.
