• 0 Posts
  • 8 Comments
Joined 1 year ago
Cake day: June 6th, 2023

  • I wish media would give credit to the organization leaders responsible for these types of moves rather than crediting a homogeneous “Biden administration.” The fact is that the administration does deserve credit for employing a number of “progressive” (read: competent) administrators, but those departments make up a progressive wing that is not in step with the overall administration’s more centrist leanings.

    Personnel are policy, something that the Biden administration has proved again and again since the 2020 election. Biden himself is a kind of empty vessel into which different wings of the Democratic party pour their will, yielding a strange brew of appointments both great and terrible.

    -- Cory Doctorow


  • I…don’t think that’s what the referenced paper was saying. First of all, Toner didn’t co-author the paper from her position as an OpenAI board member, but as a CSET director. Secondly, the paper didn’t intend to prescribe behaviors to private sector tech companies, but rather investigate “[how policymakers can] credibly reveal and assess intentions in the field of artificial intelligence” by exploring “costly signals…as a policy lever.”

    The full quote:

    By delaying the release of Claude until another company put out a similarly capable product, Anthropic was showing its willingness to avoid exactly the kind of frantic corner-cutting that the release of ChatGPT appeared to spur. Anthropic achieved this goal by leveraging installment costs, or fixed costs that cannot be offset over time. In the framework of this study, Anthropic enhanced the credibility of its commitments to AI safety by holding its model back from early release and absorbing potential future revenue losses. The motivation in this case was not to recoup those losses by gaining a wider market share, but rather to promote industry norms and contribute to shared expectations around responsible AI development and deployment.

    Anthropic is being used here as an example of “private sector signaling,” which could theoretically manifest in countless ways. Nothing in the text seems to indicate that OpenAI should have behaved in exactly this same way, but the example is held up as a successful contrast to OpenAI’s allegedly failed use of the GPT-4 system card as a signal of OpenAI’s commitment to safety.

    To more fully understand how private sector actors can send costly signals, it is worth considering two examples of leading AI companies going beyond public statements to signal their commitment to develop AI responsibly: OpenAI’s publication of a “system card” alongside the launch of its GPT-4 model, and Anthropic’s decision to delay the release of its chatbot, Claude.

    Honestly, the paper seems really interesting to an AI layman like me, and a critically important subject to explore: empowering policymakers to make informed determinations about regulating a technology that almost everyone except the subject-matter experts themselves will *not* fully understand.




  • This is so fucking exhausting.

    Lee – who went from hoping for the appointment to, in recent weeks, making a political issue out of knocking Newsom on the assumption she wouldn’t get it – spent Monday and Tuesday reaching out to fellow members of the Congressional Black Caucus to urge them to stick with her, even though there is now another Black woman in the spot. Schiff’s initial response was to trumpet the big lead he has in fundraising, which aides were hoping would get both Butler’s attention and that of reporters busy assessing her chances. California political insiders have noticed anti-Butler opposition research appearing and a new anti-Butler account on X, and have been pointing fingers over who is behind them. False rumors that Newsom offered others the appointment first have been floated, too.

    Patting backs, making nonsense announcements to get media attention, oppo research… I mean, I’m not naive, this is the way things go. But we’re never going to get the best-qualified people to serve in government while campaigning requires this many machinations unrelated to the actual merit of the candidates. Maybe sometime around our evolution to a full Type 1 civilization we’ll have figured this out.