And since none of them have equity in OpenAI, their external financial interests could influence decision-making, especially when those interests lie with a competing company where a board member is currently the chief executive.
I've seen too much automatic praise given to this board under the unsupported assumption that this decision was some pure, mission-driven action, and not enough criticism of an org structure that allows a board to bet against the long-term success of the underlying organization.
I'm going to assume that he's referring to Tasha and Helen.
I don't know if that is accurate, or even fair - the only thing I can see is that there's very little open information about them.
From the little I can find, Tasha seems to have worked at NASA Research Park and to have been CEO of a startup called Geo Sim Cities. Some sites say she's a Stanford and CMU alum, while other websites say Bard College and the University of Southern California.
As for Helen, she seems to have worked as a researcher both in academia and at Open Philanthropy.
To me at least that's an _extremely_ rude thing to do. (Unless one person is asked to do it this way, the other one that way, so people can compare the outcome.)
(Especially if they aren't made aware of each other until the end.)
I think this needs to be viewed through the lens of the gravity of how the board reacted; giving them the benefit of the doubt that they acted appropriately and, at least with the information they had at the time, correctly.
A hypothetical example: Would you agree it was appropriate if the second project was alignment-related, and Sam lied to or misled Ilya about the existence of the second team because he believed Ilya was over-aligning their AIs and reducing their functionality?
It's easy to view the board's lack of candor as "they're hiding a really bad, unprofessional decision", which is probable at this point. You could also conclude that they made an initial miscalculated mistake in communication, and are now overtly and extremely careful in everything they say because the company is leaking like a sieve and they don't want to get into a game of mudslinging with Sam.
> giving them the benefit of the doubt that they acted appropriately
Yet you're only willing to give this to one side and not the other? Seems reasonable... Especially given all the evidence so far that the board is either completely incompetent or had ulterior motives.
She only founded one, Fellow Robots, and that "company" went nowhere: there's no product info, and the company page has shut down. She was CEO of GeoSim for a short three years, and this "company" also looks like it's going nowhere.
She has quite a track record of short tenures and failures.
> She has quite a track record of short tenures and failures.
It may be good to have a failure perspective on a board as a counter-balance. I don't think this is a valid knock against her. She has relevant industry experience at least.
> wait so can't SA sue for wrongful termination if everything is as bogus as everyone is saying?
It is breach of contract if it violated his employment contract, but I don't have a copy of his contract. It is wrongful termination if it was for an illegal reason, but there doesn't seem to be any suggestion of that.
> same for MS
I doubt very much that the contract with Microsoft limits OpenAI's right to manage their own personnel, so probably not.
That's the default, but employment contracts can override this. C-level employment contracts almost universally have special consideration for "Termination Without Cause", aka golden parachutes. He could sue to make them pay out.
He would also have very good grounds for a civil suit for disparagement. Or at least he would have if Microsoft didn't immediately step up and offer him the world.
You mean like being fired by a board member as part of their scheme to breach their fiduciary duty by launching a competitive product in another company?
Have these people never worked at any other company before? Probably every company with more than 10 employees does something like this.