Twilight Saga (2012) Cast: Then And Now 2018
-------------------------------------------
Mariah Carey Vs. Alton Brown: Whose Sugar Cookies Are Better? - Duration: 5:07.
(dramatic music)
- I'm gonna stop chewing while I'm talking.
(dramatic music)
- Doesn't seem like a simple sugar cookie.
(dramatic music)
- We're ready for the cookies.
(soft music)
- All right, in the name of fairness,
I'm gonna be putting on a blindfold today
so that we have a true, blind, taste test.
I hope you are all happy.
(dramatic music)
- So I'll start with number one.
- I'm gonna grab this Christmas tree.
- Buttery, smells like cookies.
- Choose the gingerbread man.
This is pretty cute.
- Let's hear, let's hear the crack.
Oh, not much of a crack.
- I guess we'll start with the head,
that's how you're supposed to do it, right?
I feel kinda bad.
- Pretty standard.
- Very simple, really crunchy.
- Very, very buttery.
I like a lot of butter in my cookies,
but this one is almost a little bit too much,
and it's almost a little bit too dry.
- They're very dry, powdery?
If that's the right word.
- You know what?
It actually reminds me of a fortune cookie.
- It's really good, I actually like how buttery it is.
Once you get over how overwhelming it is.
(dramatic music)
- They also smell buttery.
Not much of a difference.
Crack?
Oop.
(crew laughing)
Slightly more, but also soft.
- Cooked evenly, just like the other cookie.
- Mm, these taste crunchier, right off the bat.
- I think my mother will like this.
- There's vanilla in there,
or there's more of a vanilla flair.
- Oo, this one's really good.
- There's tangerine?
I don't know, there's a kick of something.
Hold on.
- They also have a different flavoring to them.
- Yeah this has an extra flavor to it.
I feel like this one's like more basic.
- Raspberry?
- [Crew Woman] Lemon zest.
- Lemon, tangerine, I was close.
- It's very delicate, seems, doesn't seem
like a simple sugar cookie.
This was a little bit simpler.
(dramatic music)
- Number one is the winner for me.
- Two.
- Yeah, the second one just stands out to me,
because there's some random flavor in there.
- This plate, number two.
They just taste a lot more like a classic sugar cookie.
Oh man, well I hope, I love Alton, though.
He was my high school crush for two years.
I hope these are his, right, right?
- [Crew Man] Wrong.
- Oh my god, no, I have to see.
Oh god, these were Mariah's?
She's the cooking queen now too I guess.
- Mariah's family should be very proud,
because this is a good cookie.
- Damn, Mariah's such a diva, is she gonna hate me
for not liking her cookies?
But I love her, I love you Mariah, please forgive me.
(dramatic music)
(camera snapping)
(dramatic music)
-------------------------------------------
Subir la cintura de un jeans poniendo un elástico (Raising the waist of jeans with an elastic band) - Duration: 6:36.
Hello
I have here these pants whose rise is very short
as you can see
and the waist is also too tight on me
How am I going to solve this?
I'm going to replace the waistband with one made of elastic
See, like this one
which is a little wider
and that way the rise will be higher, it will fit my waist better, and of course, since it's elastic, it won't squeeze me
The first thing I have to do is remove the waistband, then overcast the edge and attach the elastic
I will put some pins on each side
and then, here at the join where the button is
so that it stays open, to make it easier to put on
I'm going to put on these hook fasteners
Let's begin. First step: I unpick the waistband
Now I'm going to overcast here, all around the waist
and for those who don't have an overcasting machine, I'm going to show you a way
to attach the elastic without overcasting
we place the elastic on the waist, wrong side out, on this side
with the trousers on top
we sew a fine line of stitching along here
and now we turn it over
and the stitching stays here; we also run a line of stitching on this side
and that way, the raw edge stays inside
you see
it stays inside and isn't seen, so we don't need to overcast
but I'm going to overcast it, because on denim the seams are thick, and if I fold it over
it would be too bulky
I'm not going to use that method; I'm going to overcast it and then sew the elastic on top
on the ends of the elastic I have made a hem: I have overcast them and sewn them
we find the halfway point of the elastic
and we mark it
and now we are going to attach it
we start with
one end
here
right here
we put a pin
and the other end
on the other side
another pin
the halfway point of the elastic at the center back
and we put another pin
we stretch a section
like this
and we put another pin, here, on the side
and we keep putting pins, holding the elastic all the way around
and now we run it through the regular sewing machine and sew a line of stitching all around, right along the edge
here at the opening I'm going to put these hook fasteners
I'm going to put two
one a little higher and one lower
and these go here, on this side
here I have the finished pants
the zipper
the hooks
It has turned out very well
we gained rise, which is what we wanted, and it isn't tight on us
Let me mention something:
for elastic, haberdasheries have few options
you either find white or you find black
at least where I live
finding a wide elastic, which is what I was looking for to raise the rise,
as I mentioned,
has been a problem
I only found this one
I would have bought it a little wider
but in the end I found this one, which goes very well with these jeans
you see
it's a bit decorative
And this is the result
if you liked the video, give it a like
and don't forget to subscribe
Share it on your social networks
-------------------------------------------
Fox News Guest Says People Die EVERYDAY From Pot Overdoses - Duration: 3:50.
On Monday, Fox & Friends decided to take a little bit of time to tell us just how dangerous
and actually deadly marijuana can really be.
Now, what sparked this conversation is the fact that in Florida, a 12-year-old boy brought
a pack of marijuana gummies to school and distributed them to some of his classmates;
uh, five of them ended up in the hospital because they got very ill.
So obviously according to Fox News and the people at Fox & Friends, this is a huge international
story, deserving of wall-to-wall coverage.
People are dying from these marijuana gummies.
So take it away, Fox & Friends.
But no one talks about this.
THC is addicting.
People back the.
I know so many people, they say they were told one thing and they tend to get addicted
to it.
And that's an addicting substance there.
There is a price to pay for Pot.
There absolutely is a price to pay for pot.
You know, I spent my entire adult life in law enforcement and a lot of it investigating
traffickers of drugs and it's not a minor nonviolent felony.
It's ruining families and killing people every day across the United States, and we stand
here in denial thinking that it's not a gateway drug to drugs that's killing people.
You don't start with cocaine.
You probably start with marijuana and it leads to other things, right?
That's absolutely right.
Alright, let's get several things straight here.
First and foremost, it feels like the people at Fox & Friends and that Polk County sheriff
were actually reading talking points from the early 1980s.
Uh, all of which, by the way, have been debunked.
Marijuana is not actually a gateway drug.
And yes, there are plenty of people in this country who do start on things like cocaine.
Uh, contrary to what Ainsley Earhardt said there, but the sheriff, that's what really
drove me crazy here, is he says, people die every day from this, talking about marijuana.
People die every day in this country.
Well, according to the government's own statistics, there have literally been zero cases of people
dying from a marijuana overdose in all of recorded history.
Zero.
None.
Not One, never.
So right off the bat, the sheriff loses all credibility by saying something that is verifiably
untrue.
But here's what really gets me about all of this.
Why is Fox News carrying water for major pharmaceutical companies?
Because that's what these attacks on marijuana are about.
It's coming from Big Pharma.
They're the ones who fund the campaigns in states where they're trying to legalize recreational
or medicinal marijuana.
They've pumped millions upon millions of dollars into these efforts.
They pay off politicians in DC to stop any kind of marijuana research, medical research,
whatever it is.
They don't want it to happen because they don't want people to understand that there
may be a low cost and no side effect alternative to their blockbuster pills.
So why is Fox News doing their job for them?
Well, I think the answer is simple: advertising.
Some of the biggest advertisers on all cable news channels, and even just in this country
in general, are major pharmaceutical companies.
So when you see Fox & Friends out there telling us that kids are dying from marijuana gummies,
just understand that those are talking points that are coming directly from the major pharmaceutical
companies who are terrified of losing any part of their profits to medical marijuana,
which is exactly what would happen and what is happening in areas all over the country
where marijuana has become legalized.
-------------------------------------------
Al Gore Shares His Memories Of George H.W. Bush | TODAY - Duration: 5:51.
-------------------------------------------
Pawn Stars: Mogen David Advertisement Slides (Season 12) | History - Duration: 2:36.
-------------------------------------------
Android Developer Story: Fanatee explores the subscription model - Duration: 3:02.
Fanatee is a studio we started in 2013 to create content-driven games,
especially word games, which we like a lot.
The most downloaded Fanatee game is STOP;
it has more than 10 million downloads on Android.
But today we have CodyCross, a game that was released in March of last year,
which has grown considerably
and today already has more active users than STOP.
Android is growing a lot for us.
Today we get 75% of our downloads from Android.
We live in a Google Play environment where distribution is worldwide,
and it's therefore important that you develop apps that can be used by anyone.
In the case of the Fanatee games, localization is extremely important.
The game has a local database, local content.
We practically make a game for each market.
We receive reviews from users in Canada who think our app was developed
there by local developers,
because some of the questions were so close to their reality.
People don't think that a company in Brazil
can create quality content that is relatable to such a different country as Canada.
We released CodyCross in March of last year and in July we tested the subscription model.
We chose Germany and the United States for this test,
as these countries have a higher app purchasing rate.
Our subscription model gives you access to additional content without watching ads;
this is its main benefit.
And our in-app purchases are hints and power-ups in the game,
which means ultimately there is no cannibalization.
Our subscription revenue for CodyCross currently represents
between 15 and 20% of our total revenue in the countries where it is on offer.
We use some interesting tools, like the free trial, which is an important one.
Someone can test the app for 7 days,
after which around 50% of people convert into subscribers.
Currently, around 80% of our subscribers began with a free trial.
CodyCross subscribers spend 50% more time in the game
and our retention of subscribers is up to 70% higher than for other players.
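[EDITOR'S NOTE: To make the funnel figures above concrete, here is a minimal arithmetic sketch using the rates quoted in the interview (roughly 50% trial-to-paid conversion, and about 80% of subscribers arriving via the free trial); the number of monthly trial starts is a hypothetical figure chosen purely for illustration.]

```python
# Back-of-the-envelope sketch of the free-trial funnel described above.
# The two rates come from the interview; the trial-start count is hypothetical.

trial_starts_per_month = 10_000     # hypothetical illustration only
trial_to_paid_rate = 0.50           # "around 50% of people convert into subscribers"
share_of_subs_from_trials = 0.80    # "around 80% of our subscribers began with a free trial"

new_subscribers_from_trials = trial_starts_per_month * trial_to_paid_rate

# If trial converts make up 80% of new subscribers, the remaining 20% subscribe
# directly, so the implied total intake of new subscribers is:
total_new_subscribers = new_subscribers_from_trials / share_of_subs_from_trials

print(f"New subscribers from trials:   {new_subscribers_from_trials:,.0f}")
print(f"Implied total new subscribers: {total_new_subscribers:,.0f}")
```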
Subscription brings in more predictable revenues
and this is very attractive for the entrepreneur, for the developer.
And it also gives you stability as a revenue stream.
The main focus is to find an interesting
offer for the user that adds value and has a
cadence of content so the person remains motivated.
In addition, always communicate and release new content
so that the user, once subscribed, realizes that value is being created there.
The developer's focus should be on increasing the amount of content or offerings.
-------------------------------------------
"Priyanka Chopra" & "Nick Jonas" Delhi Reception Live Update - Duration: 3:13."Priyanka Chopra" & "Nick Jonas" Delhi Reception Live Update
-------------------------------------------
The Facebook Dilemma: Wael Ghonim - Duration: 1:33:11.
When was it that you started to see social media as a way to maybe just kind of express something?
What were your early observations that you remember having about what social media was like back in 2011 or so if you can bring me back to where you were and what you were thinking about in terms of the opportunity for social change there?
When I went online for the first time, I was fascinated by the ability to connect with people beyond my own social reach.
And that was before the social media era.
When social media arrived, it was far more sophisticated [in] that you could actually engage with thousands, tens of thousands, hundreds of thousands of people without really having them be in your social reach, and without you having to acquire the traditional status in the society to be able to do that.
For example, you don't have to be a media celebrity or you don't have to be a well-known politician to be able to reach the masses.
And not only that.
It was not just the fact that people could reach out to others in their—beyond their social reach, but also the ability to engage with them in back-and-forth and one-to-many, many-to-many conversations.
Also one great aspect was the discoverability of like-minded people is much easier.
There are these groups online where, whatever it is, for whatever product that's out there, there will be a group of people who believe in that product and form a mini-tribe, which goes back to our instinct to want to connect with like-minded people, and that created a lot of interesting dynamics from where it came from.
I came from Egypt.
There's a government that's been ruling us for over 30 years.
It's a corrupt government.
There is no democracy; it's staged.
And there is emergency law that would get you arrested if you dare to protest.
If more than three people end up in the street without the blessings of the government, protesting, they could get arrested.
So the idea of organizing online, the idea of building common knowledge among people who are like-minded, who want to see Egypt in a different place than it is, who are mostly middle-class, young, ambitious, naive, wanted to see a better future for our country, and social media, mainly Facebook, provided that.
The first time I thought there was something big that's happening was when Mohamed ElBaradei, a Nobel Prize winner, announced that he's coming back to Egypt and he's interested in pursuing the potential of running for [the] presidency.
That was kind of like a great—it gave us great hope.
We thought—and when I say "we," [it] is people who wanted to see Egypt in a different place than it is at the time.
And I looked online; there was a group that said Mohamed ElBaradei for Presidency, and it had 200,000 people.
And I was like, "Oh, my God, that's actually a lot of people," knowing that there are only like 4 million people at the time on Facebook—Egyptians, 4 million Egyptians on Facebook.
And I started looking.
I remember clicking—at the time you could actually see the members of the large group.
So, I started looking at the people, and I can see some friends, or friends of friends, people I've met during my school time, professionals.
It just felt like: "Oh, OK, I'm not by myself.
I'm not the only crazy one here hoping for change in Egypt."
And what I thought was, this new power is going to change the dynamics of politics in Egypt.
I was extremely hopeful about it, and the main reason I thought [that] at the time is it just basically, it empowers everyone; it empowers the masses; it gives them tools of mass communication that never existed before in the hands of the masses.
I thought at the time that this is something that dictators will not be able to deal with.
And also [it's] one thing about dictators; it's another thing in a media atmosphere.
I'd imagine in Egypt, where only certain people get a voice, right?—
Describe what that was like at the time.
Yeah, it's basically everything.
In any dictatorship, the flow of information is controlled.
What social media offered is a new way where dictators are not yet aware of, that enables anyone to propagate their ideas online, and before the government realizes, the idea spreads and becomes more mainstream.
At the time, that was a whole new phenomenon that such a dictatorship regime had not fully absorbed or understood.
And that's definitely one of the superpowers of social media, which is the control of the flow of information is no longer under the scrutiny of whomever is in power that decides what should be seen and what should not be seen.
Bring me through your realization of this as such a powerful tool for liberating the society and how that led to action for you.
Yeah, I think it's all about rebalancing the powers.
I do think that dictatorship is all about creating a way for old power to operate in the hands of a very small number of people, whomever they are and however close they are to the ruling regime.
What social media brought at the time was a whole notion of rebalancing power.
A lot of us who felt like outsiders, we are influenced by this political system, but we don't have a voice into how this political system operates.
And whenever one of us dares to have a voice, they get directly punished.
What social media offered was that it reduced the price of engagement and increased the price of control for—I wouldn't even say increased the price.
At the time, it was extremely hard for them to control.
They don't know what to do.
Are they going to block Facebook?
Are they going to try and infiltrate it?
It's probably not a priority at the time.
They're not seeing it.
So it just created this environment—fluid environment where more and more people were getting engaged into the political game, and more and more people wanted change, and more and more people started getting into groups and feeling more excited about a future that looks different than the present.
You're recognizing this at the time — you had a background in computers, and you'd worked for Google at that point, right?
You're recognizing this at the time.
But bring me through how it is that you yourself wanted to use it as a tool.
I just realized from the early days of the internet that—I was lucky enough to build numerous websites that had hundreds of thousands of people using them.
I have also worked and operated in tech firms since my early days, even during my college days, and I'm a computer engineer by training.
I have continuously witnessed the new power and what it offers to people, and I was an avid believer in it.
And I also—you know, I had a very diverse background.
I lived in Saudi Arabia.
I went to Egypt when I was 13.
I went to a public school, which is not like a public school in the U.S.
It's a completely different—I always joke to my Egyptian friends, saying some of the best private schools in Egypt are not on a par with public schools in areas like California.
So I went to a public school, had a lot of experience dealing with kids from poor neighborhoods who cannot afford what I would think is the basics.
I traveled to the U.S. when I was 20; I got married to my wife when I was 21.
So I've had a life where there was a lot going on, and one thing I have learned is to adapt to environments and try and appeal to and try to question whatever environment I'm in, whatever system I'm in.
I found whatever that was happening in Egypt was very hard.
I felt very bad for the poor.
I felt very bad for my friends, like one of my friends during college time got arrested from the street and tortured for a few months before being released.
Seeing all that going on made me feel like there is something I need to do.
I didn't know what is it that I need to do, but I'm not going to live my life just watching from afar.
I've got to engage.
The internet provided a bit of a safe space for me to engage because I engaged anonymously.
I was too paranoid to even join any public protest.
I was too paranoid to engage with activists, because I know the government is monitoring everything, and whenever, as I said, they find someone posing any threat to their power, they punish them severely.
It's either by character assassination, putting you in jail; in some cases you could get also killed.
There was a huge price to engaging directly, and the price seemed much, much less when it comes to engaging online, especially because you could engage anonymously, and that's how I found my role.
I wanted to do something.
The way I see it in retrospect is activism is all about reducing the price of citizen engagement, because you want everyone to have an agency and you want them to exercise their agency, while at the same time increasing the price of oppression.
You want the regime to find it harder and harder to oppress people, and if you're doing one of the two, you are doing correct activism.
If you're not doing one of the two, then you're just doing something else.
To me, my role was to tell more people who are in the middle class, who are like me, who have not been so exposed to my experiences—because I went to public school, I engaged with people from poorer backgrounds and— I have not experienced their life, but I kind of know from afar what they're exposed to.
My job was to try and tell them what's going on, and to also tell myself and tell them that there is hope and that we've got to work on something.
So what was the catalyst for creating the Facebook page?
… The page was named We are all Khaled Said, because in a dictatorship or in a system where force is used to manipulate people, the people who exercise that force are never accountable for their actions, because if you start making them accountable, from a regime perspective they could eventually stop acting.
What [happened] was Khaled Said was killed, and I did feel, I could be this guy.
I could be walking in the street, getting beaten up by police officers and dying, and nothing happens to the police officers, and if you really want to stop this, you have to increase the price of oppression.
You have to tell them that it's enough and that we want justice.
We want our freedom; we want our dignity.
And I really think that a lot of what we did had to do a lot with dignity.
When you are stripped out of your right to speak up, when you are stripped out of your right to be part of the system, to have a voice in the system that governs you and impacts you directly in your day-to-day life, you're stripped out of your dignity.
It was a call for dignity that I wanted to voice as an individual that happens to be resonating with hundreds of thousands of others who thought that this voice is a good representation of what they also feel.
Were you there in the street that day?
Or you—this was something that was—
No, no, that was online.
Online, OK.
Help bring me into the nitty-gritty of that story of your deciding to create the page, where were you and what was—bring me into your kind of, how that—was it just in the course of one night that you decided to do that?
On June 10, [2010], I was in my study room in Dubai.
I was at the time working for Google in the Middle East office.
I spent a lot of time in Egypt, so I was still very connected to what's happening in Egypt, and I was hoping for Egypt to see a change in 2011, which was the presidential elections.
I saw on my feed a photo of this guy's face, like someone that seemed to have been tortured.
And it just mentioned his name and a photo of him, just like a portrait photo of him when he was alive.
I just could not believe the cruelty and the pain in the photo.
When I see something that doesn't look right, I just question it, and I was like, well, there must be something missing here.
Then I was reading, and it's two police officers that were running after him, want to catch him, and he ended up being beaten up, and he died.
His brother leaked the photo, because normally when such a thing happens, the government does its best to hide everything, but he managed to snatch a photo of his brother's body and sent it to political activists.
When I saw the photo, I was devastated.
I started—I remember I was crying in my study room because I just felt like this has got to stop.
You can't just keep going in that way and think that Egypt is going to progress as a country, and I do feel responsible to put time and effort toward this.
And I just decided to create that Facebook page.
Not that I was thinking that this Facebook page was going to be effective.
I just did it out of like, "OK, that's what I could do.
I could be anonymous; I could create a page, and I will try in this page to focus on one thing."
I was not like trying to create a movement or anything.
I just wanted to call for bringing the officers who did this to justice because what I thought at the time was if you increase the price of oppression, it's likely you could save lives.
When I created the page, the name was We are all Khaled Said.
The idea behind the name was that I did feel like I could be him because, as I mentioned, I've witnessed the state security in Egypt kidnapping and arresting people for months torturing them.
And I knew that just the fact that I'm getting engaged in activism, I could actually be also arrested, killed or kidnapped or all these things could happen to me, and my job is to actually resist.
I don't want to just watch from afar.
I want to engage, and the page was a great way to engage.
A few hundred thousand people joined the page.
In the first three days, actually, 100,000 people liked that page.
And it was—I had complex feelings at the time.
One is like, "Oh, my God, this is actually getting serious."
And I have told a few activists about the page early because I did not think it was going to be effective or it's not going to be that big of a deal.
But when it went to 100,000, I started freaking out about, "Did I just put myself at risk, because if it spreads that I'm the admin of this page, I could actually get hurt, if I end up organizing events and stuff?"
And through this page, I tried to, with another co-admin [who] I happen to know from Facebook, and we've never met online at the time—his name was Abdel Rahman [Mansour], and I told him, "Do you want to help out?"
I liked his ideas and his writings.
I was like, "You want to help out with me managing the page, since it's going to look like—it's going to be a full-time job to create a flow of information, create common knowledge, come up with creative ideas on how could we facilitate dialogue with the government?," because I don't believe in just opposition for the sake of opposition.
My goal is to reduce oppression, and if that happens by collaborating with people from within the system, let that be. ...
Of course, throughout the next few months the page was growing until what happened in Tunisia.
Ben Ali—as you probably remember, what was called the Arab Spring started with massive protests in Tunisia, following a man who set [himself] on fire because of the economical conditions, and that led to [Zine El Abidine] Ben Ali fleeing [from] Tunisia.
This is the first time I've seen an Arab dictator try to apologize, and then when people say no, they have to escape.
It just created, for me, a moment of "Maybe we can do this.
Maybe we could actually revolt against our own government and get them out of our way," because for democracy to happen in Egypt, those people are never going to bring it.
Those people do not believe in democracy.
They have been running the country for 30 years, and they don't want others to share power with them.
So it was inevitable.
And I just posted an event calling for a revolution in 10 days, like, "We should all get to the street, and we should all bring down [Hosni] Mubarak."
That was actually—I was just a sound of what the people were saying, because a lot of people were saying that.
What I did was just basically put the invitation on a page that has hundreds of thousands of followers, and that ended up—the event on Facebook reached a million people and it was confirmed by 100,000.
We took to the streets, and it just basically was one of those—it showed to me the great power of social media.
I remember a lot of interviewers, early when I did anonymous interviews, that told me: "Oh, yes, social media, it is overrated.
It's slacktivism.
People are just venting their anger, but nothing is going to happen.
Nothing concrete is going to happen on the ground."
I did not believe in that.
I believed that those people are not zombies; they are real people, and granted the right time and the right environment, the right group, the right call, they will act.
And that's what happened on 25th of January.
Tens of thousands of people, men and women, a lot of them are young, broke the barrier of fear.
My best memory on the day was a young man who stood, who stopped a police tank by just standing up in front of it, and he could have, might as well [have] just died at the time if the police then decided to—but the police tank had to stop.
For me that was an announcement of a new era, a new relationship between the government and the people.
And that was kind of the moment where I felt that online energy could actually be converted into an on-the-street action.
And what was it that prompted you to actually make an event on the page?
Was it because of what was happening in Tunisia? What prompted it?
It was mainly the belief that there is no hope—there is no hope in believing that people who got us to where we are at the time are the ones who are going to get us out of that.
I did not believe in the insider's ability to turn things around.
I got a lot of emotional hope, and I say emotional because it was very emotional when I saw what happen[ed] with Ben Ali.
And there were videos.
That flow of information was actually amazing at the time.
And if you—if that would have happened 20, 30 years ago.
Most of the government, the mainstream media is going to portray a certain image of how Tunisia is bad and whatever happened there is bad and how Egypt is stable.
Actually, the mainstream media tried to do that, but it didn't work, because they are not in control of all the media anymore.
There is social media that is [not controlled] by them.
They're controlled by algorithms developed in Silicon Valley that they have no way to infiltrate at the time.
Before the event, what was it like for you to have seen the ranks of people on the page grow to hundreds of thousands?
Was this just completely by surprise to you?
What was your thinking at the time?
It is—I wouldn't call it a surprise, but it was both a responsibility and an opportunity.
I like to believe that I'm responsible as much as I can.
For example, I told my co-admin, AbdelRahman, we don't want to call for any protests.
Most of these people who joined the page are apolitical, and we don't want to expose them to risks, because the government would arrest them.
We don't want to get them to do things that they don't understand the consequences of.
Little do I know that [a] few months later, I'm going to put in an invitation for a revolution.
But it was more I took that as a responsibility.
I tried to be as responsible as I can, as honest as I can.
It was anonymous, so there was an alter ego, which I now say it's like the best version of me.
I'm not as good as that alter ego, but I would strive to be the alter ego.
And it was more of a responsibility.
At some point it was too much as well.
I did not enjoy the fact that you are responsible.
Sometimes we have done activities, and every time there is a group activity, I'm very concerned about the safety of everyone.
And those are people I don't know, I don't have access to—if something bad happens to them, and I don't even know how to help them if something bad happens to them.
So I wanted to be as responsible as I can.
And that's why I was not super-happy and excited by the fact–Those are not numbers.
That's one thing that I keep coming back to.
I also think about that, and I tell that to my fellow engineers and product managers: "You know these metrics in your dashboards, they are real humans.
They are people who could be at risk, who could also benefit tremendously from your technology.
They could prosper.
They could be killed.
All these things could happen to them."
So these are not like, when we say growth 10 percent, what does this 10 percent mean?
So I was not really—I was looking at the metrics definitely because I want to make sure that there is more, the movement is growing or the idea is propagating, instead of saying the movement—because it was not a movement in its traditional meaning of a movement.
The idea is propagating.
But I was also trying to be as responsible as I can.
And there was no playbook at that point in time, right?
I mean, in terms of creating a social movement or propagating ideas, this was relatively new?
I also don't believe in playbooks.
I think you could learn from other people's experiences and past; you can relate to other [people's] past experiences.
But there is no such thing as playbook.
Every society has its own culture, has its own norms.
Every group of people have their own ambitions.
So, I don't believe that much in playbooks.
What was fascinating about social media though is you can get instant feedback.
You could get to understand how and tweak your message accordingly.
It's a double-edged sword, though, because as you know, the short-term emotions are not necessarily always right, and that's actually pretty dangerous, because you could optimize for getting short-term rewards [at] the expense of long-term value.
And we'll talk about that.
That's actually one of the big problems within social media companies that happens.
Also, that's a problem within social movements right now.
That's a problem with politicians, because somehow, if we are optimizing for what's short-term versus what's long-term, and most of what's optimized for in short-term is very emotional and less rational.
We'll definitely get to that, because that's an important realization, I guess, that you come to a little bit later. Right.
I mean, with the demonstrations.
Everything breaks loose at that point in time.
Yeah. People broke the fear. People broke the fear.
The authoritarian regime wanted people scared, because if they stop being scared and they speak up, that becomes very dangerous for the survival of the regime or for the power they're holding.
Basically, the protest was breaking the barrier of fear.
I was snatched out of the street after two days of the protest, and I was blindfolded and handcuffed for 11 days.
I don't know where I went to.
I was interrogated by what I assume to be state security officers or intelligence officers.
I have no idea.
During the time, the protest went into a roller coaster.
One day, hundreds of thousands of people show up, and then an emotional speech by the former president [Hosni] Mubarak, then pressure from families asking the youngsters to go back home and, "Everything will be all right, and he's not going to run again for presidency, and his son is not going to run again.
His son is not going to take over as people were anticipating."
I was released after 11 days.
I came out to see a completely different Egypt than the one I left.
I was taken—I saw a completely different Egypt than the one before I was arrested.
In fact, in the square, it was more of like Muslims, Christians, liberals, leftists, everybody's collaborating....
There was a lot of euphoria in the square when I was released from prison.
My experience before 25th of January is like, it's hard to get hundreds of people in the street to resist the regime, and I went to the square, and there was barely a place for someone to move.
People were in hundreds of thousands.
They were full of energy; they wanted change.
Everyone wanted change for a different reason, but they all wanted change, and they were all hopeful.
I saw an Egypt that I want to live in.
Everybody is getting together, men and women, Christians and Muslims and nonreligious people, leftists, liberals.
There was kind of a euphoria within and energy within the square that just made me naively believe that that's an Egypt that we could actually achieve as soon as Mubarak is out.
A few days later, Mubarak was forced out of power because the protests started getting to even his own area in Heliopolis, and the army decided to ask him to leave power.
It was an unprecedented moment.
It created a lot of hope, created also [an] unrealistic sense of [worth] for people like me who were involved in the protests.
But sadly enough, we were humbled by the consequences, the failures that we've experienced.
And I say sadly.
It's good to be humbled, so that's not the sad part.
The sad part is that things went in a much worse direction than what we had anticipated.
The hardest part for me was seeing the tool that brought us together tearing us apart.
Before 2011, I always thought of the internet as a liberating tool.
How can a dictator control this?
It's uncontrollable.
And after this, I removed the word "liberating," and it's just a tool that will be easily used by anyone because it's just a tool.
The fact is the economics of using it for the bad is much more rewarding than the economics of using it for the good.
Take fake news, for example.
It is much cheaper to create fake news, and it's much more engaging to create fake news, and therefore fake news could easily spread across the internet, while to verify news, it's much more expensive.
Economically you have to hire professional journalists who are going to spend time verifying the information, and then, at the end of the day also, it's less engaging, so it gets much less distribution.
Realizing, now as I'm saying it, it's like I was extremely naive in a way I don't like actually now.
I don't like admitting that.
Wow, how naive I was thinking that these are liberating tools.
And I think part of that naiveté was falling for drinking our own Kool-Aid as engineers and as people who work in building these technologies.
Everybody likes to think that they're—like, you probably want to think that your job is the most productive, the most useful for the world.
Otherwise you're going to find…You are not going to be able to wake up in the morning excited about your job.
And that's what we did.
But the only, I mean not the only, one of the critical problems that [has] happened as we build these products is that we did not take the serious consequences, both intended and unintended, as serious[ly] as they should be taken.
These tools were built for connecting people, and people did connect, but anybody could manipulate others using these tools.
Anybody could hack into this system, into this game, and get the attention that they want, and direct people's attention toward that.
It's kind of like it becomes [a tool] of mass propaganda, and it becomes much harder to deal with because they are much less obvious.
In the past, you know, "OK, government X created a mainstream media channel; it's financing it to create propaganda and in country Y."
You already know these are facts.
But now it's kind of like all these anonymous players, just like how I was anonymous—I use my anonymity for good.
But there are others—or what I perceive as good, I'm not making a clean judgment call on behalf of others—what I perceive as good—but there are others who are going to use it in what I would perceive as bad and dangerous.
And that raises all these questions about the technologies that we have built and how we rushed into using them and building them without thinking much about the consequences.
There was a moment that you were being interviewed on CNN at the time, and you said that you really one day wanted to go meet and thank Mark Zuckerberg and Facebook.
Do you remember that?
Yeah, I remember that.
Tell me about that.
Tell me about that moment and what you were thinking in that interview.
It was just everything that was happening was surreal.
In just 25 days, I started a call for a revolution.
I traveled from Dubai to Egypt to witness one of my very first and biggest protests in my life.
I got snatched out of the street, got kidnapped and tortured and released in 11 days.
I went outside to become, all of a sudden, a recognized face and got a lot of media attention.
And then a couple of days later, [a] 30-year dictator was forced out of power.
It was surreal, to the extent that I started building all these romanticized views of the world of politics, of everything, and then I was also very appreciative of the technology, because the technology was for me the enabler.
I would not have been able to engage with others, I would not have been able to create the page, I would not have been able to propagate my ideas to others without social media, without Facebook.
That's why for me—I was very grateful for whatever Mark and his team built, and this is why I wanted to meet him.
I actually met him as well.
At the time I did see him when I visited the U.S., and it was a nice meeting.
In it, I did feel that Mark somehow had mixed feelings about this, like what he built was not to be used in politics.
I think his heart was in the right place, because he thought this is more of a social product where people engage.
He didn't see it as a place where people spark revolutions.
He did not create a strong viewpoint on where should Facebook go.
Actually, I remember it was funny at the time, because I asked to take a photo with him, and I don't put online photos with people I meet.
I find that—I don't do that.
Let me not judge others who do it; I just don't do that.
But I think Mark was not comfortable with the idea, and he said, "Oh, we'll take the photo and mail it to you."
And I said, "Sure."
I knew they were never going to mail it to me, and he never did.
And why do you think he was uncomfortable with that?
I just think he did not want his product to be perceived as an active political player.
And that's something I actually told him and others in the meeting: "I'm very grateful for this product, but I also am happy that you are not an active player in the game because you should not be.
You're just providing a technology tool, and you should not be part of that game."
Actually at the time, a lot of people were criticizing Facebook because they shut down our page because I was using a fake account to operate the page with.
I also had mixed feelings about that.
On one hand, yes, I did not want my identity to be known by Facebook, that I'm the one running the page.
On the other hand, I understand what they are doing.
They want to be protective … because someone could just—someone could create an anonymous page and harm others, impact the life of others in a very negative way. …
When you look back at your own naiveté, what were the sorts of things that started to break that for you, that you started to see happen specifically where social media took on a less optimistic light for you?
It was mainly how the algorithm works.
It was mainly how—and I talked—
Bring me through [that].
What were you recognizing about the algorithm?
The best thing in Facebook was actually the News Feed at the time when I was managing the page, because the fact that most of the people are actually not going to proactively go to the page every day to see what the page is up to just reduces the audience significantly.
The most amazing part of Facebook was the fact that everybody goes to a News Feed that is personalized to them, and they could see content, and if they happen to engage with the page, they are going to see more of the page, and that does give us enormous power.
Those algorithms that operate these News Feeds make billions of editorial decisions a day, because there's probably tens of millions, hundreds of millions of pieces of content that are created every day, and the algorithm makes the call on who should see what.
You can be clever in trying to make your message more engaging, so that when you post on Facebook, people like and share and comment, so more people end up seeing it because the algorithm is going to say, "Oh, OK, that's engaging content; people like it; show it to more people."
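[EDITOR'S NOTE: As a rough illustration of the engagement-driven ranking being described, here is a minimal, hypothetical sketch in Python; the posts and scoring weights are invented and do not represent Facebook's actual algorithm.]

```python
# Toy engagement-driven feed ranker: posts that attract more likes, comments,
# and shares score higher and are shown to more people. All data and weights
# below are hypothetical illustrations.

posts = [
    {"id": "measured-analysis", "likes": 40,  "comments": 5,   "shares": 2},
    {"id": "sensational-claim", "likes": 900, "comments": 300, "shares": 450},
    {"id": "baby-photo",        "likes": 120, "comments": 30,  "shares": 10},
]

def engagement_score(post):
    # Comments and shares are weighted above likes because, in this toy model,
    # they are treated as stronger engagement signals.
    return post["likes"] + 3 * post["comments"] + 5 * post["shares"]

# Rank the feed purely by engagement; the ranker never asks whether a post is
# accurate or polarizing, only how much interaction it generates.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["id"], engagement_score(post))
```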
Sounds good until someone decides, "OK, then how can I optimize for increasing my audience?"
Unfortunately, one of the things I learned later, which I think was a very precious learning, was when I read Daniel Kahneman's book Thinking, Fast and Slow, and he talks about humans have two systems, one that's very emotional, irrational, that's very spontaneous, takes actions pretty quickly; and there is another, system two that's more analytical, takes time.
Most of the time we hire system one, because system one is less brain-taxing.
You know, if we hear sudden noise right now, we're just going to look at where the noise is, and we're not going to process much before that because it's more attuned to our survival.
System two is—for example, sophisticated, complicated mathematical problems or situations that has long-term implications on us, we tend to use system two.
What social media does is it basically puts system one on steroids.
It just gets you very emotional.
It basically hacks into your human emotional self.
I'm not saying that, by the way, as if there was a master plan.
That's one thing I don't like in the current movement of trying to criticize Facebook, which is it makes it seem that there's a master plan to hack into people's brains and control what they—it's much more simpler than that.
Those are basically product managers and engineers who really wanted to do good and build good products that get used by billions of people.
I mean, at the time, probably they're not even thinking about millions, but they now probably think in billions of people, and they just want to feel good about what they're building.
They're not really into hacking anyone or manipulating them or getting them addicted.
These are all, in my view, unintended consequences.
And it's very hard to be in their position.
It's extremely hard to be in their position.
I actually empathize a lot with people at Facebook, and I think they're into a very hard problem, because they're operating a very complicated economical system that no matter how they try and move and change things, there will be always unintended consequences.
And the most harmed part of society that loses in whatever change they do is going to be the one that's mostly speaking up.
So back to the question, I think the big part of naiveté here is inability to understand that these tools are just enablers for whomever.
They don't separate between what's good and bad; they just look at engagement metrics.
I can't remember who said this, he was saying: Journalism, good journalism, covers stories that are true and hopefully engaging.
Social media covers stories that are engaging and hopefully true, [EDITOR'S NOTE: The saying comes from a Nieman Reports article published in Oct. 2017.] and I think that's a fundamental difference.
When did you start to question the engagement optimization system?
So you have this page; you obviously engaged a lot of people; it helped to spark something at a particular moment in time.
But when did your thinking start to change?
What was it that you were observing at the time?
It was the spread of misinformation, fake news in Egypt in 2011.
I was called names; I was under a lot of smear campaigns.
… My favorite one was when I was called a Freemason.
I did not even know what the word means at the time.
I had to Google it.
Basically, conspiracy theories are very viral in our part of the world, and [an] engagement-driven algorithm does not care about if Wael is a Freemason or not or if the source is reliable or not.
If more people are liking and commenting and sharing the fact that Wael is a Freemason well, let the algorithm tell everybody, "Hey"; you know, announce it.
And of course, when it hits personal, you are more emotional about it, and it was just not about my own example.
I was just seeing this environment that if you increase the tone of your posts and become more exclusionary against your opponents, you are going to get more distribution because we tend to be more tribal.
So if I call my opponents names, my tribe is happy and celebrating.
Yes, do it; like, comment, share.
Let everybody hear about it.
If I'm trying to be objective, if I'm trying to actually humanize my opponents, barely I'll get distribution.
In most of the cases I will not get distribution; I'll not get rewarded by the system.
So I started thinking.
I do think that social media is an economic system.
It has all aspects of economic systems.
People are trying to aggregate wealth — wealth in this case is power demonstrated by others following you and listening to what you have to say and your ability to propagate your ideas.
The more you have an audience, the more likely you are going to be celebrated.
And in this economic system, the norms are designed by a very simple rule.
Rewards are superpowers.
If you reward a certain behavior, you'll see more of it.
So if you tell me if I'm more polarizing, if I'm more sensational, if I attack my opponents, if I insult them, I'm going to get more distributions, a lot of the people are just going to do that, and the bad content is going to outrank the good content, and it becomes very challenging for people to create good content, because you know that there is no audience.
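[EDITOR'S NOTE: As a hypothetical illustration of the "rewards are superpowers" point, here is a small simulation in which creators gradually copy whichever style earns more reach; the multipliers and rates are invented purely to show the feedback loop, not measured from any real platform.]

```python
# Hypothetical reach multipliers: in this toy model a sensational post
# reaches five times the audience of a measured one.
REACH = {"sensational": 5.0, "measured": 1.0}

share_sensational = 0.10   # 10% of creators start out posting sensational content
switch_rate = 0.20         # fraction of creators who reconsider their style each round

for round_num in range(1, 9):
    # Creators who reconsider adopt whichever style the ranking rewards more per post.
    if REACH["sensational"] > REACH["measured"]:
        share_sensational += switch_rate * (1.0 - share_sensational)
    print(f"round {round_num}: {share_sensational:.0%} of creators post sensational content")

# Because the reward is lopsided, the share climbs round after round:
# rewarding a behavior produces more of it, and measured voices lose reach.
```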
And then here comes the argument—because I've had a lot of arguments with engineers who work in ranking systems, and I've personally worked very close to engineers who work in ranking systems—about, "Well, isn't this what people want?"
No, it's not what people want.
It could be what people's system one wants, because it's emotional; it's tribal.
But generally speaking, if you take two people and tell them that, "Consequences of this extreme polarization [are] going to be toxic in the society; it's going to actually get us into a point where we're all going to lose.
Do you want this extreme polarization?," most likely they'll say, "No, we don't."
The idea here is that these economic systems were built with, in my view, a lot of naiveté, because they were meant for cat videos, baby photos, "I'm getting married," "Here's the food I'm eating."
Yeah, I mean, the idea that the most engaging [posts] get most of the distribution does make sense here, because there is no consequence.
There aren't many consequences for society from these decisions.
But once this gets into issues such as race, gender, social unrest, it becomes very dangerous, because what you are doing is you are increasing the gap; you're widening the polarization; you're rewarding sensationalists and big voices.
That's why, in my view, the rise of Donald Trump with his certain behavior on social media is some sort of a hack into the system, because someone just [knew] how to create content that gets everybody to consume, and he just used that with brilliance.
But how early was it for you that you started to recognize what the system really was, that there were perverse incentives inside of this system?
It's 2011.
It's the few months following the revolution, because the political environment was already toxic on the ground.
That's something also I try to talk about.
Facebook does impact our behavior, but it's also a mirror of our behavior, right, like on the ground.
It can't be that if the society is in peace, Facebook is going to come in and disrupt it, because if the society is in peace, most likely they're going to use the tool to amplify the peace.
What was happening in Egypt was polarization.
A lot of the powers that were together in the square not wanting Mubarak did not know what they want, or they discovered eventually that what they want is contradicting.
An Islamist wants an Egypt that is different than what a liberal wants, and all these voices started to clash, and the environment on social media [bred] that kind of clash, that polarization, rewarded it and made it actually less and less appealing to try and build consensus.
I was seeing that with a lot of grief and disappointment and actually started pointing that out early on, from 2012, because I thought these tools could become actually very destructive in the hands of wrong people, or not wrong people, by rewarding the bad behavior, because I don't think there are right people and wrong people.
Just like if you reward the bad behavior, you are going to get more of it, and if this bad behavior is toxic for the society, that's pretty dangerous.
It's going to take us into points where there are huge consequences that could have been avoided.
It also just seems what you're describing is that it's not neutral.
You're recognizing at that point that it's not necessarily —an algorithm makes decisions that are not—
It's editorial.
Right, so I think what I'm trying to understand is, are you recognizing also that the middle ground, or the middle politically or socially or whatever, doesn't have a competitive advantage in this?
Help me understand what you're observing there.
… One of the challenges of engagement-driven algorithms—and it has to do with human nature—is that rational voices that [try] to bridge the gaps and bring people together lose the podium.
They don't get the distribution.
They don't get the engagement, because what they are posting is mostly not tribal; it's mostly not polarizing; it's mostly not sensational.
And that's not what most of the people who engage—not most of the people in general but most of the people who engage with content—want.
You're likely to engage with content that confirms your beliefs, and the more it confirms your beliefs, the more likely you are going to engage with it.
And if you are a part of a minority or even part of a majority that wants to feel good about yourself being in that group, you want to hear the most polarizing views about the other group because you want to justify to yourself why you are in position X and not in position Y.
I think that's one of the challenges.
Taming down sensationalism is pretty critical.
People talk about fake news; people talk about impact on political campaigns.
I think these are important issues.
I'm not underestimating them by any means.
But I think there are far more foundational, structural problems that need to be addressed, which has to do with these platforms that reward polarization, these platforms reward sensationalism, and that [has] got to stop.
Or in other terms, they've got to be tamed down.
You have to create a view that's not partisan.
I'm not saying do this to the right or the left.
You just have to figure out how to detect sensational content and how to ensure that people who create that content are aware [of] the fact that the more they are sensational, the less likely they're going to be heard within these platforms, and that will eventually change their behavior and get them to be less sensational.
And if you tame down these discussions from being my ego versus your ego into the issue that we are having a conflict around that could be good.
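[EDITOR'S NOTE: As a hypothetical illustration of the "tame it down by not rewarding it" idea, here is a small variation on the toy ranker sketched earlier, where an assumed sensationalism score (however it might be detected) is used to discount distribution; this is an invented sketch, not a description of any platform's actual system.]

```python
# Toy variation: discount engagement by a detected "sensationalism" score, so
# highly sensational posts gain less distribution from the same engagement.
# Both the scores and the damping rule are hypothetical.

posts = [
    {"id": "measured-analysis", "engagement": 120,  "sensationalism": 0.1},
    {"id": "sensational-claim", "engagement": 3000, "sensationalism": 0.9},
    {"id": "baby-photo",        "engagement": 260,  "sensationalism": 0.0},
]

def tamed_score(post, damping=0.8):
    # The more sensational the post, the more its engagement is discounted.
    return post["engagement"] * (1.0 - damping * post["sensationalism"])

for post in sorted(posts, key=tamed_score, reverse=True):
    print(post["id"], round(tamed_score(post)))

# A sensational post can still rank if its raw engagement is overwhelming,
# but the reward for being sensational is much weaker than before.
```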
You can't bring it down to—we're not robots.
We're still going to have emotions involved in every situation we're in.
But I don't see this as a binary decision whether we are sensational or not.
But we can be less sensational.
We could be less sensational by not rewarding sensationalism.
And I think that's one of the biggest challenges facing Facebook, because it's not easy to do that.
It's not easy to detect sensational content at scale in different languages.
It's not easy to do it because also, once they start doing that, many of these very opinionated people are going to say, "Facebook is censoring our content," and they're going to go out and complain about that.
That's why I think—That's the point.
Facebook, in my view, need[s] to make a decision about which direction it's going to and stick with that consistently.
They're not going to be able to please everyone.
There will be no pleasing of everyone.
Now as we know that there are long-term consequences to decisions taken by product managers and engineers and designers and data scientists sitting in their nice office in Facebook headquarters that could be impacting civil war in Myanmar—now as we know that, we have seen that, we should be much more careful about bringing these changes.
So, you know, breaking things fast is kind of like, you build them and break things fast, ["Move Fast and Break Things"], it's the culture, but we should probably slow down.
We should probably think more deeply about the changes we are building into our products, because, was it Winston Churchill who said that "We shape our buildings; and [thereafter] our buildings shape us"?
So we shape our tools, and then our tools shape us.
But what about your concerns at the time about what really was going on?
I tried to talk to people who are in Silicon Valley, because I was connected to a large group of people, but I feel like it was not being heard.
It was not being heard.
Even in 2015, it was not being heard.
It was underestimated, I think.
Tell me, then, what you were trying to express to people in Silicon Valley.
It's very serious.
Whatever that you are building has massive, serious, unintended consequences on the lives of people on this planet, and you are not investing enough in trying to make sure that what you are building does not go in the wrong way or go to the wrong people doing the wrong things.
My general feeling was that I was not being taken seriously, which is OK.
Most of these people are firefighting.
They're dealing with problems.
They are under the stress of their public companies, reporting quarterly reports.
They have to show growth, and investors are buying the stock in anticipation of growth in the next few months.
I entertained this thought of like—There's an expression that I got exposed to recently, which is "casino capitalism."
It's almost like we're playing a game [and] everyone wants quick outcomes, because otherwise can you imagine companies saying, "OK, wait a moment.
Let's just stop our plans to grow and try and be a better force for the society"?
Sounds great, but that's not going to sound great in the stock market.
That's probably going to be reflected in employees' morale, because once they see the stock sinking because the growth is not happening, they end up being demotivated; a lot of the big money that they could make is not going to be made anymore.
So I empathize with the companies.
It's not easy to prioritize this until it becomes a bottom line.
Luckily, it is a bottom line right now, or it's becoming a lot more of a bottom line.
It's becoming more important for them to address these issues because they could see—they are now able to see the immeasurable consequences on their brand, right?
Although their stock price, for instance, is at a high right now.
Yeah, because they are still operating based on—there isn't that much structural change that [has] happened yet, in my view.
Most of it is reactive.
Most of it has to do with trying to fix the problems that the media is talking about.
And I think it's well intentioned.
I'm not judging the intentions here.
I'm more questioning whether these changes are actually what are needed.
It's pretty hard to go for the needed changes.
They have serious consequences on the business.
But going back, I'm so curious about those kind of conversations that you were having in the early days, like through 2011, 2012.
Actually, the conversations I would have with more conviction in Silicon Valley, I would say probably 2013.
I started in 2013.
During the period between 2011 and 2012, I was pretty busy with the day-to-day of Egyptian politics, and I was also less sure about anything.
Like there was one time, there was a page on Facebook in Egypt.
It had like, hundreds of thousands of followers.
All it did was create fake statements on behalf of politicians and public figures, and I was a victim of that page, where they wrote statements about me insulting the army, the Egyptian army, which on one hand puts me at serious risk, because that is not something I said.
So I don't want to own something I didn't say, and on the other hand, it plays to the propaganda machine that is working on smear campaigns against me.
I tried to get that—reported that page, and Facebook's response at the time was, "We can't verify whether this is a true statement or not, and we're not in the business of verifying that."
There were two sides of me.
One fully empathized, because I'm a technologist, and I understand that it is extremely hard for them to actually verify, but the activist in me was very disappointed, because, well, you built that technology; you'd better know how to manage it, because otherwise, people are going to be hurt.
I could be beaten up in the street because of that page, and you are responsible for that.
In a very direct way you are responsible, because you operated the platform that created such fake news that posed a threat on my own personal safety.
I've been always in this position of I hear the two sides, and I know how complex it is, and I know it's not as easy as writing that "Facebook as a company is evil; it's capitalistic; it's just trying to make money.
People there are waking up in the morning just trying to make money by manipulating our brains."
That's not what's happening.
But at the same time, the people who are building these platforms are not fully understanding the impact and the humans involved.
Whatever they're doing, that impacts the lives of humans.
For them these are numbers—you know, 10 percent growth, 15 percent engagement—and it's very expensive to measure the immeasurable.
Did it dawn on you at the time as well (obviously you liked Facebook at the time and saw it as a tremendous tool for good) that Facebook essentially was becoming the public space of the 21st century?
It was becoming a public space, but a privatized one, with its own incentive structures and its own economy of sorts.
Was that something you were hip to at the time or thinking about or concerned about?
Yeah, I'm definitely concerned about it, because I found myself in 2016 spending a lot of time and energy thinking about how to regulate social media, and there was a part of me that said, "Well, from using a certain tool to try and regulate governments, you are now going to governments trying to regulate these tools, and that's an irony."
But I thought maybe from the outside, it could seem like an irony.
But what I'm looking for is rebalancing the powers.
The power has shifted so strongly now toward Facebook that they are not really accountable.
There are no checks and balances for the company.
We only deal with after the fact, after the mess....
We discovered many, many months later that Russian propagandists were using Facebook and creating Facebook pages, distributing content that is polarizing.
We understood many, many, many months later that there were advertising campaigns with the same objective, or that Cambridge Analytica managed to get a lot of access to our private data through what seems to be a legal/illegal gray area: legally, they had access to it, but they illegally stored the information and made use of it.
We're learning all these things after the fact, but these things are so crucial and critical that we should not be learning about them after the fact.
There need to be more checks and balances, and that's where I think the effort needs to be invested.
I'm not in the business of questioning other people's intentions.
I like to believe that most of these actions are not driven by evil intentions.
They are driven by good intentions that are naive and uneducated, maybe arrogant, but they're not—as I said, no one is sitting down saying, "Oh, let me think about how to manipulate as a product manager at Facebook; let me think about how to manipulate the brains, hack the brains of billions of users."
No one is thinking that.
They're thinking, "How could we bring people more—How could we get them to engage in a positive experience that makes them love our brand and continue to use it for the next few years so we can sell them ads and make money?"
That's why I think the energy and effort should not be directed toward outrage.
It's like we're fighting outrage with outrage.
It should be directed toward: "OK, Facebook, there are things that need to happen.
In order for all of us to be part of this conversation, you have to share more data.
You have to be much more transparent.
You cannot be saying that we are all like equals at the table when you have all the data and none of the other stakeholders have access to that data."
And that's why I think transparency—it's not the solution, but it's a core element of approaching the problem so that we also deal with that actual problem and not with a delusional problem that someone is incentivized to create so that they could benefit from it.
But what's so strange is that here you were in 2011, and there was this uprising against a central power, a centralized power.
But at the same time, you were using a tool that essentially was centralizing power over the public space, or at least online.
Is that something—?
I did not think about it that way at the time, but now I do think about it that way.
I'm not opposed to centralization; I'm not opposed to decentralization.
I do think that there are times where centralization is a better tactic for a system to thrive, and I think sometimes [de]centralization is, and I think there are times where a hybrid model need[s] to exist.
So it's not about—What worries me is not about whether public space is centralized or not.
What worries me is the absence of checks and balances, is the absence of accountability, is the absence of being humble to understand that you know very little and you have to really study what you're doing before you do it.
We have to also be careful what we are asking for, and we have to not criticize Facebook for being naive and arrogant and building products that have sophisticated impact on people's lives without paying attention, while we suggest solutions that fall under the same category of being naive and arrogant and not really understanding the consequences of what we're asking for.
And that's where I think the trouble comes, right?
It is extremely hard.
There is this scenario—In [the] absence of data, it is extremely hard to be informed and to come up with solutions that work, and if Facebook is not providing the data, they are creat[ing] a scenario where they should be very accountable for everything that happens on their platform because they are the ones who are in the only position to fix it.
There's one thing that I want to talk about, about the algorithms, that actually I did not.
OK. All right. What is it?
I call the algorithms "mobocratic" because they empower the mobs.
They are not algorithms that optimize for objectivity and truth and the values of what I would say [is] a healthy society.
They are optimized for the anger, for the emotional outbursts, right?
The most ridiculous opinions are the ones that will get propagated for—.
One of my friends was telling me: "Oh, these congressmen and -women do not really know the internet. Their questions were ridiculous."
I looked at them, and I was like, "Have you actually watched a few hours of the whole hearing of Mark Zuckerberg?"
He said, "No, I just watched the clips on Facebook."
Basically, most of the clips on Facebook were the most ridiculous questions and answers.
They were not the actual dialogue.
Because I think there were very deep questions and deep answers as well from Mark during the session, but because everybody [is] in the business of creating viral content that's outrageous, it was mostly the dumbest questions, with the exception of one that was not that dumb, that got propagated, and that could create a wrong perception in the heads of people: "Oh yeah, these Congresspeople, they don't understand what they're talking about."
But there are actually ones that really understood what they're talking about.
And they put Mark in a position that is—I could feel from the energy of how he was speaking—that got him thinking.
Again, I just think that it's the nature of the algorithms, which, as I was just saying, are mobocratic.
They enable the mobs; they reward the mobs; they incentivize the mobs.
[It] is the structural problem of social media, and it's so structural and so hard to change because it's linked with the notion that advertising revenue is in correlation with engagement.
And if you try and basically optimize for what's true, what's not sensational, what's less polarizing, you are going to hurt engagement.
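A back-of-the-envelope illustration of that tension, with every number invented purely for the sake of the example: if advertising revenue scales roughly with engagement, then any ranking change that de-amplifies outrage shows up immediately as a measurable short-term cost, while the long-term benefit to the platform and to society stays hard to measure.

# All figures below are hypothetical, chosen only to illustrate the trade-off.
revenue_per_engaged_session = 0.002      # dollars, invented
daily_engaged_sessions = 1_500_000_000   # invented

def daily_revenue(engagement_multiplier: float) -> float:
    return revenue_per_engaged_session * daily_engaged_sessions * engagement_multiplier

status_quo = daily_revenue(1.00)
tamed_down = daily_revenue(0.93)   # assume de-amplifying outrage costs 7% of engagement

print(f"status quo: ${status_quo:,.0f} per day")
print(f"tamed down: ${tamed_down:,.0f} per day")
print(f"visible short-term cost: ${status_quo - tamed_down:,.0f} per day")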
… One of the things, going back in time again, is that the basic premise was to unite people, right?
And then you're recognizing clues that this technology or this platform is not necessarily having that downstream consequence.
No, what I realized is that it does not only unite people; it could tear them apart, too.
I think if I'm building, if I'm contributing to this system, I'd put a goal of how could we see more of the unity and less of the polarization.
And again, it's not binary.
It's never going to be binary.
You're not going to achieve a state where, "Oh, this is a platform where everybody's united."
You want to be a platform for all ideas.
You want to be a platform for empowering people to have their voice and get it out.
But you also want to try and build for whatever gets the best version of people.
So you have to have a sense of direction of what—you know, I do think Facebook is a publication, whether Facebook likes it or not.
It makes billions of distribution decisions every day.
Its algorithm is designed by—the engineers are the proxy editors.
The engineers and product managers, designers, data scientists—they're all proxy editors.
They make algorithms that optimize for certain [types] of content.
So they are making editorial decisions, and the question here is, are these editorial decisions the best editorial decisions?
And the question of, the best for what?
Probably they are the best for Facebook—we all know that—on the short-term, because it does increase engagement; it does help grow the company's advertising revenues.
The argument that is hard to prove [is] that this could actually hurt the long-term, because Facebook does not want to eventually become a tabloid newspaper.
I would imagine its size is much bigger than just being a tabloid newspaper.
It has much more potential than that.
Therefore, I think it's more of a question of what do we do within this complicated system in order to make the algorithms less mobocratic, more meritocratic?
We operate in knowledge-based societies.
The more knowledge you have, the more you should get a voice, and the more you should demonstrate that knowledge in a nonpartisan way.
I'm not saying one group versus the other.
The question is how to build the algorithms to reward that.
You just don't—Popular content is not necessarily good content.
It could be toxic content.
It could get people to do bad things.
And until we fully realize that and penalize bad content—First of all, identify bad content and what is bad content in a way that is nonpartisan, in a way that could have enough consensus, and then you start penalizing it so that you see less of [it].
Then also define good content and see how you would be able to reward it.
I don't think we're there yet.
I also don't think that we're paying enough attention to the algorithms.
We're talking more about fake news, because these are easier topics to understand.
They're more viral; they create a much more simple narrative.
"Oh, yeah, Facebook is promoting fake news.
Hey, Facebook why don't you stop all the fake news on your platform?"
But, you know, what is fake, OK?
Is "person X did event Y" fake?
How do you know if it's fake?
Because there are cases that are clearly fake news, and you could spot them.
But there are much less nuanced cases, where even the best editor in a newspaper would not be able to say if it's fake or not.
So how do you deal with that?
And I think the way I would approach this problem is to actually go back to the fundamentals.
You're building an economic system with an incentive structure.
Just go review these, because there is a bug there, and the bug is leading to a lot of unintended consequences that none of you want to see happening, none of the stakeholders within the system want to see happening.
Therefore, it's probably better to tackle that and fix it.
So those conversations that you were having in 2011 and 2013?
No, after 2013.
After 2013, were you saying to engineers and others in Silicon Valley there are serious unintended consequences to what you've designed here?
Yeah, that's basically—my English was not as articulate as yours, but yeah, I was saying that.
You were saying that.
And the response was what?
Was there concern?
The response was what?
Their response was, "Yeah, we hear you," like, "Yeah, maybe, but then what do we do?
Is it engagement?"
It's almost like I was thinking about creating a version of what I call frequently asked questions and answer[ing] them all, because there are things like, "Isn't that what the user wants?"
One of the arguments I've always had is, "Why are you in a position to—Do you want to create an editorial, like, 'What Wael thinks' as the core of how the algorithms should operate?"
And I'm like, "Absolutely not."
But I think that the engagement-driven algorithms are not the answer either.
They realize—they understand that there might be unintended consequences.
They are much more skeptical than me about them, because no one wants to think that they're waking up every morning creating bad, unintended consequences in the world.
Even if you are working at a tobacco company, you're probably not—you will have to believe less in "smoking kills."
You will have to convince yourself that this is all too much, and, you know, "My uncle smoked until he died at 90, and he was completely fine."
No one wants to admit that what they're doing is bad.
I got a lot of that, and I also got some empathy.
But then people have day-to-day problems to solve, and that was not their bottom line.
They're not going to go out of their way, dropping engagement and not [being] celebrated within their teams, and go through their six-month performance appraisal and be proud of dropping engagement on the platform [by] 20 percent.
No one is going to do that.
Everyone wants to see what the leadership of the company wants to see [in] the company, and they find their role within so that they could thrive within the organization.
That's why I understand the complexity of it and how hard it is, and I'm always empathetic of how hard this problem is, especially because it's a new problem, especially because we have not built enough nuance yet; we don't understand this yet.
We created a monster.
I mean, "a monster" is an outrageous term.
We just created something that's way bigger than any of us.
I don't think Mark himself has ever imagined that Facebook is going to go this far, and I don't think that he wants to see it harming anyone.
He actually wants to see it as a platform for good.
Right.
But at the time, you were saying that it's not simply a platform for good; that if you're optimizing for engagement, and that's what your algorithms do, then there are harmful consequences, and it can create divisions as much as it can create unity.
Yeah, and most of the responses are, "Yeah, there is bad, but the good outweighs the bad."
… And then you got sunk into the PR narratives of these companies.
It's just, again, you go into a version of casino capitalism, everyone is celebrating and want[s] to feel good about themselves, and we're not responsible for these problems.
Even technically, you can't also directly link the problems to the platforms.
There is a bit of exaggeration that takes place.
Even I'm aware of my own exaggeration toward how the platforms are responsible for these problems.
Unfortunately, the narrative that everyone looks for is simple: you know, how to point fingers, who to point fingers to, and what's the simple message you have.
Life is far more complex than this.
Can a company of this size and reach and power and scale actually solve these problems, or is the problem itself its scale?
I think that companies are like humans.
We are never going to be perfect, but there are people who strive to be perfect or strive to be good.
And in this journey, they make a lot of mistakes, and they learn, and they improve, and there are ones that just don't care; there are ones that continue to be who they are and they're happy with whatever they are doing regardless of the consequences.
My general philosophy here is that yes, companies could always improve, and their scale is not—it should not be used as an excuse.
It's actually power.
Facebook as a company or YouTube or any of these social media companies make billions of U.S. dollars in revenues and profits, and they could [put] some of these resources back into creating more responsible technologies.
And one of the things I keep thinking about a lot recently is why aren't there enough social scientists within these platforms that are included, not in the aftermath, not in like "Let's analyze the data coming out of the products we're building," but more like: "Let's actually be part of the product-development cycle.
Let's actually have conversations with the social scientists who are empowered to help us understand what could be the potential blind spots that we are having when we're dealing with these products that change and alter the behavior of humans."
Why aren't we having enough of those?
Why aren't we having enough people who understand different languages, if we are operating in a certain country with a certain language?
Why aren't we pouring in enough resources, or hiring enough people to help?
I understand the arguments, by the way, for things like these.
For example, the social scientist one, that slows down the product-development cycle, and you want to build, because the culture here—that needs to be changed, by the way—is to build fast and break things.
And I don't think you should—at the size of Facebook, you should not build fast, and you should be very careful as you break things, because when you break things, you could be causing a lot of damage and unintended consequences.
It's also a matter of the bottom line.
For example, people who [do] operations are treated as cost centers; until we stop thinking of them as cost centers, it's going to be very hard to justify hiring a lot of resources for that.
Facebook actually did talk about their—you know, they are working on increasing their operational teams that take care of all these abuse issues and—safety teams and community teams.
That's a great step.
I don't think the problem here is that we can't solve these problems, because, as I said, it's not binary; you just strive to create fewer problems as much as you can.
I think the real problem here is more of focus.
If you are only focused and consumed by the short-term nature of these businesses and corporations and not the long-term implications on the society, then you are a great member of the casino capitalism game, and you are definitely contributing to making the world a worse place.
If you want to be a member of the people who want to make the world a better place, you have to think more about the long-term consequences.
And it's extremely hard.
It's easier for me to say that here as I'm sitting down and I don't own the accountability to whatever solutions that I would propose.
But it is inevitable for them, running such a massive-audience platform.
It is inevitable for them to invest much more heavily on this.
I would imagine this is a bottom line for the society.
It might not be a bottom line for the company right now, or as much of a PR crisis.
It's more of a PR crisis than a bottom line for the company.
It's starting to be one, but it is definitely a bottom line for the society, because election results will be—could be altered using these tools.
Politics could be hacked using these tools, and social norms and social stability could also be disrupted.
So these are really serious problems that need real, serious dedication toward solving them if we care about the society.
When you were telling the story about starting the initial page, you said that it was June 10, 2011.
'10.
And you misspoke.
Can you just say again—It was June 10, 2010?
On the 10th of June 2010, I was working at Google at the time, in Dubai.
And I was in my study room, and as I was browsing my Facebook feed I saw a photo of a tortured person.
He looked young, and I just couldn't believe what I saw, because it was a corpse. It was a dead person, and it said the police [had] beaten him to death.
I normally try and question everything I see on social media and I try and verify.
So I was like this is too bad to be true.
It's so crazy that the photo is—It was crazy and I started looking online and trying to figure out if this is actually a true story.
This guy's really Egyptian.
I started learning his name is Khaled Said, he's an Alexandrian.
And he was chased by two police officers who ended up beating him to death.
And I remember crying in my study room because I felt that, you know, there is no price for the police force to pay for such actions.
And as long as they don't pay for it they're going to continue doing it.
In any dictatorship, those who exercise the force on people are barely accountable for any bad actions.
I felt very helpless.
And I also felt that I could be this person, I could be Khaled, I could be walking in the street.
I could get politically engaged and get arrested and beaten up and tortured or killed and nothing, nothing happens.
At that time, I wanted to do something about it.
I did not want to just like sit down and watch from afar and grieve and move on because that has been happening for years.
I decided to create an anonymous page on Facebook.
I did not really see that as something big or—I just wanted to express myself.
I felt like I have something that is important to say.
It was mainly for me, I did not want to look back and say that happened, and I just didn't do anything about it.
And in just three days this page... I named the page "We're All Khaled Said" because I wanted to tell everybody: you could be Khaled the next day; probably today you could be Khaled, you just don't know.
And I created the page, and for the first few posts I was actually writing in the name of Khaled, as if Khaled were alive or speaking from his grave.
… With all that in mind, I wanted to try and create an environment where we could get closer together and increase our voice and ask the regime to take us more seriously.
I do have to ask you this.
Was the movement—did the movement fail?
Well, I think, the way I see it now is that it was too naive to celebrate on the 11th of February 2011, because a dictatorship of 60 years, or even longer, cannot be undone overnight.
I mean, the first time we've had elections in Egypt that were fair and democratic was in 2012, when Mohamed Morsi won by 51 percent.
He barely beat his opponent.
It's also equally naive to think that it failed as of today, because what happened was a massive disruption of the balance of powers.
It's true we're in a downturn now—actually, things are far worse than they used to be in 2010—but I'm not one of those people that thinks everything failed.
I think that's a good consumable media narrative, just like the good consumable media narrative that evolved in 2011.
These things will take time.
It's not over yet.
It will come in different shapes and forms, and when I say it's not over yet, it's not like I want chaos or—No, what I'm saying is this disruption—the consequences of this disruption are not fully realized yet.
I don't know what's going to happen in Egypt two years from now.
And that is a change, because 20 years ago, you would know what's happening—what's going to happen to you.
It's just going to be the same.
Everything is status quo.
The regime forced everyone to abide by its norms, and it was more of an oligarchy, where a very small number of people were sucking up most of the resources and silencing everyone else.
-------------------------------------------
Nik Angyal Named Academic All-American of the Year - Duration: 2:14.[Nik Angyal] This season has been a great year for the U of R. I mean, we had a great year here last
year as well. We went as far as we'd ever been to that point, getting to the Elite
Eight and we lost some key players from that team, but we rebuilt and retooled just
like we do every year and the guys have really stepped up to take the place of
the guys we lost last year. And this year we're in the Final Four, which is
incredible. I mean, it's the farthest U of R has ever been and it's really
exciting to go on down in Greensboro and trying to compete for a national
championship. As a student of chemical engineering here, I've had the
opportunity to get involved in some undergraduate research, which has been
really exciting. This past summer I worked in a lab under a professor in the
chemical engineering department where I was developing a film that was going to
be used in lithium-ion batteries, which are rechargeable batteries. And it was
really exciting to be a part of research that is going to be hopefully in use in
the industrial world some day. I'd say that my interest in chemical engineering
probably started in high school back when I took AP chemistry. I had a really
good teacher and she really inspired me to learn more
about chemistry and start to explore how the world works, which is really what
I find interesting and what chemistry is all about. So, coming into college I knew
I wanted to do something in chemistry, and I wanted to be a little more on the
practical side of things, so I wanted to go into applications of chemistry, which is
why I chose chemical engineering.
It's a very humbling experience to be named Google Cloud Academic All-American
of the Year for Division III men's soccer. Obviously, I was named to the second-team
last year and that was very exciting and then this year when I found out that I
was the Academic All-American of the Year, I was a little speechless, to be honest. It's a
really high honor and I think it's a good culmination of my efforts
both on the field and in the classroom, and it's very, very exciting to be honored this way.
-------------------------------------------
El Cambio Climático es Culpa Nuestra y Puedo Convencerte - Duration: 12:52.
-------------------------------------------
Culiacán recibió a Gerardo Ortiz con los brazos abiertos | Un Nuevo Día | Telemundo - Duration: 3:58.
-------------------------------------------
New YouTube Channel Intro | Interracial Couple - Duration: 1:58.This is our story
I'm Sam
and I'm Dan
Dancing brought us together
After 2 years of dating, we got married
We had our fair share of struggles and challenges
as an interracial couple
From food
values
and friends
I love the outdoors
and I love Netflix
I love meat
and I like salad
But in the end, love always prevails
We travelled the world for 6 months
From the World Cup in Russia
to the beautiful coastline of Croatia
From the seafood markets in Japan
to the crazy street food adventures all over China
and suddenly
Oh my God!
Do you see two lines?
Yeah!
I'm pregnant!
This is our life!
Sam and Dan
Welcome to our channel!
-------------------------------------------
Flüchtlingskarawane: Migranten nach Grenzübertritt in die USA festgenommen - Duration: 2:56.
-------------------------------------------
Article 13 SOLUTION (This Could Work!) - Duration: 3:42.hey it's Hoz here and I think I have a solution to article 13. Now if you don't
know what article 13 is, where have you been?
There's a link to the website behind me in the description below this video - go
and check it out because it's super important, it affects you and it affects
me and every content creator and the public at large. Essentially very briefly
the European Parliament has passed this motion called article 13 and it will
force all member countries in the EU to pass the law or pass it into a law
within two years and it's all about copyright and the gist of it if I'm
right is that they will make platforms like YouTube and Facebook and so on
liable for every piece of content that anyone uploads and this is with regards
to copyright etc. Now the problem with that (can you spot the problem?) is that in
YouTube for example there are millions well...
there are hundreds or thousands of videos uploaded every single second so
it's impossible to police and I think the general consensus is that what's
going to happen is that these platforms will simply have to block the content
from content creators just to be on the safe side otherwise they're gonna get
sued. So that means I'm in the UK and potentially everyone in the U.S. won't
be able to watch my content - there'll be some kind of screen that says you can't
watch this - and that may affect us vice versa as
well we might not be able to watch content from US content creators and
anywhere else around the world and this is a bad thing okay imagine all the
favorite things that you like to watch on YouTube you suddenly can't watch! So
my solution is this: what if YouTube (Google) puts together an agreement
for every content creator and that Agreement says something along the lines
of "I will take responsibility for my own content so I absolve you of being
responsible for me and I'll take on that responsibility". Well, I'd sign that! Of
course that means that I can post content and if there's a problem with
that content copyright wise then YouTube won't get in trouble because I signed that
agreement. I think this could work. It needs to be a simple agreement that
focuses only on that - I'd hate for this to be something that other big platforms
use to you know let's say take advantage of creators and create this
big long agreement that includes other stuff you know. I think what would work
is a short agreement that is just about that article 13 and if there was one
such agreement that said you know sign here and that means you absolve us of
the responsibility of the content of your videos then I'd sign it if it gets
us past Article 13. So that's my solution. Anyway, I don't know who to contact
because I don't know anyone but maybe you do so just pass this on and let's
see what happens.
It's important, it affects you, it affects me, it affects every content creator and
the public at large... oh! I almost fell then...
-------------------------------------------
Honor 8x - Лучше, чем Xiaomi - Duration: 5:48.
-------------------------------------------
Smart TV LED 50" Philips 4K 50PUG651378 - Duration: 1:19.
-------------------------------------------
TRENDY PIXIE HAIRCUT COLOR BY TOP STYLIST VIVYAN HERMUZ - Duration: 3:49.