In Brief
- Robinhood CEO Vlad Tenev asserted that tokenizing stocks for private companies, including OpenAI, represents a significant milestone despite controversy.
- OpenAI subsequently denounced Robinhood’s unauthorized tokenization initiative.
- The token offered price tracking access but did not represent actual equity or voting rights.
Robinhood CEO Vlad Tenev acknowledged that the company's first offering of tokenized stocks drew heavy criticism, conceding that calling it "controversial" was an understatement. Still, he maintained that the launch fulfilled its purpose, describing it as a "big milestone."
Earlier this month, the trading app introduced a giveaway of OpenAI stock tokens on Arbitrum, an Ethereum layer-2 network. The initiative quickly drew criticism, with OpenAI forcefully rejecting it shortly after the announcement and vowing to halt the operation.
The OpenAI tokens, distributed almost exclusively in Europe, tracked metrics such as secondary-market trading prices but did not convey actual company stock or voting rights. Tenev emphasized that this price exposure, rather than equity ownership, was the essence of the offering.
Despite pushback from OpenAI and regulatory notifications, Tenev remains bullish on tokenized stocks. “Now it’s just about expanding it to more companies,” he stated, indicating ambitions to scale the platform thousands-fold. He also referenced Robinhood’s longer-term plan to integrate these instruments with decentralized finance (DeFi) markets.
The launch drew SEC scrutiny shortly afterward, as market participants questioned whether trading platforms like Robinhood have the authority to unilaterally tokenize securities. SEC Commissioner Hester Peirce cautioned that "tokenized securities are still securities," and thus remain subject to existing securities regulations.
Robinhood appears intent on institutionalizing its tokenization project, expanding both the range of companies on offer and the depth of available financial products. Tenev downplayed the speculation surrounding the SEC's intervention, characterizing it as a general statement of the regulator's stance on the tokenization of securities rather than an action aimed specifically at Robinhood.