It’s been a minute since we last heard from Chris Cox, the man who at one time was, as WIRED put it, “effectively in charge of product for four of the six largest social media platforms in the world.”

As Facebook’s longtime chief product officer, Cox has essentially served as Mark Zuckerberg’s “chief of staff for executing product strategy” since at least 2014. Zuckerberg might be the “face” of Facebook, but Cox is, per the company’s own employees, its “heart and soul.” Under his steady hand, the platform grew into a global behemoth with more than two billion users and a market value of more than $500 billion. He played a major role in the creation of News Feed and, somewhere along the way, was handed the reins of WhatsApp, Messenger, and Instagram as well.

It is not an exaggeration to say that, by 2019, Cox was the most powerful chief product officer (CPO) in the world. In an era where the line between digital life and real life is almost nonexistent, that made him one of the most powerful people in the world, full stop.

But then something went awry. Zuckerberg announced a shift toward end-to-end encryption and a plan to integrate the four apps in Cox’s portfolio under one umbrella. Shortly thereafter, in April 2019, Cox left the company. He was reluctant to share the reason for his departure, but we know that Cox was an advocate for limiting toxicity and preserving the safety and well-being of Facebook users. So it is not difficult to imagine at least one reason he left: increased encryption, while enhancing privacy, makes it harder than ever for Facebook to curb hate speech, human trafficking, conspiracy theorizing, terrorist plotting, and all the other terrible behaviors that have turned it into a democracy-threatening cesspool.

But then, barely a year after his resignation, Cox returned to his CPO job, citing an urge to “roll up my sleeves and dig in to help” in the face of a “public health crisis, an economic crisis, and now a reckoning of racial injustice.” It was a bafflingly quick turnaround. If anything, Facebook had become a far more fraught place to work during Cox’s absence, shouldering more and more of the blame for societal maladies ranging from rampant misinformation to tech addiction to election tampering. Antitrust investigations, both at home and abroad, swirled around the company like a tornado, and Facebook’s own employees were in a state of perpetual discontent.

All of these monumental challenges seemed only to motivate Cox further. “In the past month the world has grown more chaotic and unstable, which has only given me more resolve to help out,” he wrote. Since then, other than an update here and there on his Facebook page, he has not been very visible, which is a shame. As the chief product officer of the world’s biggest social media platform, Cox remains one of the world’s most influential technology executives. We would like to hear more from him. Per his own staff, Cox is “everything Mark wished he could be,” and he is renowned for his ability to compellingly explain what the company is up to, for better or for worse.

We can’t seem to get serious answers from Zuck. By contrast, taken at his word, Chris Cox seems to instinctively want to help make things better. So here are five questions we’d like to ask Chris Cox:

1) The election has come and gone, and the actual balloting went pretty smoothly, at least relative to the dystopia of chaos and violence that many feared. But with tight results and delayed counting in several states, the election misinformation machine began firing on all cylinders. Facebook’s response was limited to adding notifications to posts that included “premature claims” of victory by candidates and to putting limits on political advertising. Unsurprisingly, Facebook was rife with misleading and threatening posts and misinformation in the aftermath of Election Day.

Do you, Chris Cox, really feel your company’s response to misinformation on and after the election was sufficient when the stakes were so high for our country? Do you really feel you did enough when a key pillar of our democracy – a free and fair election – was under unprecedented attack?

2) The much-touted Facebook Oversight Board has finally begun reviewing high-impact content moderation cases, including recent ones involving nudity, incitement of violence, and hate speech. The board writes that once it has ruled on these items, “Facebook will be required to implement our decisions.”

Chris Cox, what will that implementation look like? One of the cases, for example, concerns the removal of a video and comments criticizing France’s refusal to authorize certain controversial drugs for the treatment of COVID-19. Let’s say the Board agrees this post should, in fact, be removed, as Facebook has already done under its Violence and Incitement policy. What happens next? Other than keeping the post down, how do you foresee “implementing” the Board’s decision in future product developments? As the Board makes specific decisions on complaints (some of which may already be moot because they are no longer burning issues), how will Facebook turn the lessons of these cases into actionable, positive, systemic changes across the company?

3) You left Facebook following Zuckerberg’s announced plan to focus on an “encrypted, interoperable, messaging network.” You never explained why (so far as we are aware), but some have speculated it is because your work to prevent hate speech, conspiracy theories, human trafficking, and other harms on the platform would be jeopardized by an infrastructure that could make Facebook’s most toxic content invisible, even to Facebook itself. In any case, you are back in the saddle, but we haven’t heard how you are squaring your concerns about encryption with your efforts to prevent harmful speech.

So, what has changed? Have you made your peace with encryption? How will it affect your efforts against harms on Facebook’s platforms?

4) In 2008, you served, for a short time, as Facebook’s director of human resources. You later said the experience taught you that “we don’t need innovation in the field of HR and recruiting, we just need competent managers.” Since then, in the wake of scandal after scandal, employee dissent has become commonplace at Facebook.

If you were leading HR today, how would you handle these concerns? What would you do to fix a company culture that isn’t just rankling current Facebook workers but also jeopardizing the company’s incoming talent pipeline?

5) And lastly, Chris, a question near and dear to our hearts. Facebook and your Big Tech peers report that you have committed substantial resources to meeting the biggest headline challenges around content moderation. No doubt about it, it is difficult to make fast and fair decisions about highly subjective material without, as Zuckerberg often laments, becoming “arbiters of truth.” Heck, you even spent $130 million establishing the aforementioned independent oversight board to try to answer nothing less than some of the most difficult questions around freedom of expression online.

Our question is this: if you can put this much time, effort, and money into the difficult content moderation issues, why can’t you commit more resources to cleaning up the easy stuff, i.e., reducing the incidence of piracy on your platform? You are willing to deauthorize groups that engage in bad behavior, so why not extend that to the tons of “free movie” groups where members aren’t just linking to external piracy sites but actively hosting full uploaded films on your platform? That kind of behavior is awfully obvious and doesn’t require wrestling with “truth.” Piracy is a crime, and it is costing the U.S. economy at least $29.2 billion every year. Why is it still on your platform?