A large motivation for this move is likely to ensure that attempts by some incumbent ISAs to lobby the US government to curb the uptake of RISC-V are stymied.
There appears to be an undercurrent of this sort underway where the soaring popularity of RISC-V in markets such as China is politically ripe for some incumbent ISAs to turn US government opinion against RISC-V, from a general uptake PoV or from the PoV of introducing laborious procedural delays in the uptake.
Turning the ISA into an ISO standard helps curb such attempts.
Ethernet, although not directly relevant, is a similar example. You can't lobby the US government to outright ban or generally slow the adoption of Ethernet because it's so much of a universal phenomenon by virtue of it being a standard.
Then, there's NASA, and their rad-hard HPSC RISC-V. It's a product now, with a Microchip part number (PIC64-HPSC1000-RH) and a second source (SiFive, apparently.) I suppose it's conceivable that a Berkeley, CA-developed ISA that has been officially adopted as the new rad-hard avionics CPU platform by the US government's primary aerospace arm could get voted off the island in some timeline, but it's looking fairly improbable at this point.
Only time will tell if it ends like: "to avoid someone else shooting us, let's shoot ourselves".
Dedicated consortiums like CNCF, USB Implementers Forum, Alliance for Open Media, IETF, etc are more qualified at moving a standard forward, than ISO or government bodies.
> There appears to be an undercurrent of this sort underway where the soaring popularity of RISC-V in markets such as China is politically ripe for some incumbent ISAs to turn US government opinion against RISC-V, from a general uptake PoV or from the PoV of introducing laborious procedural delays in the uptake.
> Turning the ISA into an ISO standard helps curb such attempts.
Why do you think that would help? I fail to see how.
An ISO standard is hard to geopolitically regulate, I would think.
It also cements the fact that the technology being standardized is simply too fundamental and likely ubiquitous for folks to worry about it being turned into a strategic weapon.
Taking the previously mentioned Ethernet example (not a perfect one, I should emphasize again): why bother blocking its uptake when it is so fundamentally useful and enabling for a whole bunch of other innovation that builds on top?
Not only that, it might turn RISC-V from a specification freely available under a FOSS license into a proprietary standard that you have to pay 285 CHF (~$350) to buy a non-transferable license for.
> The RISC-V ISA is already an industry standard and the next step is impartial recognition from a trusted international organization.
I'm confused. Isn't RISC-V International itself a trusted international organization? It's hard to see how an organization that standardizes screws and plugs could possibly be qualified to develop ISAs.
ISO defines standards for much more than bolts and plugs. A few examples include: the C++ ISO standard, IT security standards and workplace safety standards, and that’s a small subset of what they do.
They develop a well defined standard, not the technologies mentioned in the standard. So yes, they’re qualified.
Titanic is not an example of why building ships has to be avoided. C++ is a great example, yes, of the damage ambitious and egotistical personas can inflict when cooperation is necessary.
It is certainly an example of why SC22 is a bad idea.
The "C++ Standards Committee" is Working Group #21 of Sub Committee #22, of the Joint Technical Committee #1 between ISO and the IEC.
It is completely the wrong shape of organization for this work: a large, unwieldy bureaucracy created so that sovereign entities could somehow agree on things. That works pretty well for ISO 216 (the A-series paper sizes), and while it isn't very productive for something like ISO 26262 (safety), it can't do much harm there. For the deeply technical work of a programming language it's hopeless.
The IETF shows a much better way to develop standards for technology.
The fact that the C++ committee is technically a subgroup of a subgroup of a subgroup is among the least of the issues of ISO for standardization.
The main problem is that ISO is a net negative value-add for standardization. At one point, the ISO editor came back and said "you need to change the standard because you used periods instead of commas for the decimal point, a violation of ISO rules." Small wonder there's muttering about taking C and C++ out of ISO.
I would argue that the structural problem is an underlying cause. So it won't be the proximate cause, but when you dig deeper, when you keep asking like a five year old, "But why?" the answer is ultimately ISO's structure and nothing to do with Bjarne's language in particular.
Hence the concern for the non-language but still deeply technical RISC-V standardization.
FTA: “Since 1987, JTC 1 has overseen the standardization of many foundational IT standards, including JPEG, MPEG, and the C and C++ programming languages”
Compared to ISO, RISC-V International has almost no experience maintaining standards.
Even if you think that isn't valuable, the reality is that there is prestige/trustworthiness associated with an “ISO standard” sticker, similar to how a “published in prestigious journal J” sticker gives scientific papers prestige/trustworthiness.
> It's hard to see how an organization that standardizes screws and plugs could possibly be qualified to develop ISAs.
You, my friend, have not delved into the rabbit hole that is standardisation organizations.
ISO and the IEC go so far beyond bolts and screws that it's frankly dizzying how far-reaching their fingers are in our society.
As for why, the top comment explained it well: there is a movement to block RISC-V adoption in the US for geopolitical shenanigans, and standardisation with a trusted authority may help.
Government agencies like to take standards off the shelf whenever they can. Citing something overseen by an apolitical, non-profit organization avoids conflicts of interest (relative to the alternatives).
That’s the definition of throwing the baby out with the bath water.
Is ISO as an organisation sometimes imperfect (as in the .docx case)? Sure; it's composed of humans, who are generally flawed creatures. Is it generally a good solution despite that? Also sure.
They've published tens of thousands of standards over 70-plus years that are deeply important to multiple industries, so disregarding them because Microsoft co-opted them once, 20-odd years ago, seems unreasonable to me.
Office Open XML, the standard behind .docx and other zipped XML formats, was fast-tracked into the international standard without many rounds of reviews (by the same JTC 1!).
> “International standards have a special status,” says Phil Wennblom, Chair of ISO/IEC JTC 1. “Even though RISC-V is already globally recognized, once something becomes an ISO/IEC standard, it’s even more widely accepted. Countries around the world place strong emphasis on international standards as the basis for their national standards. It’s a significant tailwind when it comes to market access.”
He says that, but I don't agree. If anything, it would have been less successful at being picked up in discount markets if the specs weren't free to download, and I don't know what fringes they're trying to break into, but probably none of them care whether the spec is ISO.
That can depend on how the spec gets made into an ISO standard. There is a process called "harvesting" that can allow the original author to continue to distribute an existing specification independently of ISO.
Usual lies. There are a plethora of largely ignored international standards. Making something an international standard is just one of many ways to pursue wide worldwide acceptance, and it still has a high failure rate.
My take is that it could help tie up fragmentation. RISC-V has different profiles defining which instructions are included for different use cases, like a general-purpose OS, and enshrining them as an ISO standard would give the entire industry a rallying point.
Without these profiles, we are stuck memorizing what a word soup like RV64GCBV_Zicntr_Zihpm means.
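To be fair, the word soup is at least mechanical: after the base (`rv32`/`rv64`), single-letter extensions run together until the first underscore, and multi-letter extensions (Zicntr, Zihpm, ...) are underscore-separated. A simplified parser sketch (it ignores version suffixes and the fact that `g` itself abbreviates `imafd_zicsr_zifencei`):

```python
def parse_isa_string(isa: str):
    """Split a RISC-V ISA string like 'rv64gcbv_zicntr_zihpm' into parts.

    Simplified sketch: real ISA strings may also carry extension version
    numbers (e.g. 'v1p0'), which this deliberately ignores.
    """
    isa = isa.lower()
    head, *multi = isa.split("_")             # multi-letter exts are '_'-separated
    base, singles = head[:4], list(head[4:])  # 'rv64' + one letter per extension
    return base, singles, multi

base, singles, multi = parse_isa_string("RV64GCBV_Zicntr_Zihpm")
print(base, singles, multi)
# rv64 ['g', 'c', 'b', 'v'] ['zicntr', 'zihpm']
```

Still a word soup, but at least a decodable one; the profiles give you a single name for a whole bag of these.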
Fossilised is often desirable, or even requested, in some industries. Developing for the embedded market myself, we often have to stick to C99 to ensure compatibility with whatever ancient compiler a customer or even a chipset vendor may still be running.
I wouldn't say it never had a problem, but the profiles are definitely a reasonable solution.
However even with profiles there are optional extensions and a lot of undefined behaviour (sometimes deliberately, sometimes because the spec is just not especially well written).
The FUD keeps being brought up, but the solution here was in place before the potential issue could manifest.
It started with G, later retroactively named RVA20 (with a minor extra extension that nobody ever skipped implementing), then RVA22 and now RVA23. All application processor implementations out there conform to a profile, and so do the relevant Linux distributions.
Of course, in embedded systems where the vendor controls the full stack, the freedom of micromanaging which extensions to implement as well as the freedom to add custom extensions is actual value.
The original architects of the ISA knew what they were doing.
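The progression described above can be sketched as a table; the lists of mandatory additions here are abridged and from memory, not exhaustive — the ratified profile specs are the authoritative source:

```python
# Abridged, illustrative sketch of the RVA profile progression; consult the
# ratified RVA profile specifications for the authoritative contents.
RVA_PROFILES = {
    "RVA20": ["G", "C"],                       # roughly RV64GC
    "RVA22": ["Zba", "Zbb", "Zbs",             # bit-manipulation
              "Zicbom", "Zicboz"],             # cache-block operations
    "RVA23": ["V",                             # vector extension now mandatory
              "Zicond"],                       # conditional operations
}

def cumulative_extensions(profile: str) -> list[str]:
    """Everything mandated up to and including the given profile."""
    exts = []
    for name, added in RVA_PROFILES.items():   # dicts preserve insertion order
        exts += added
        if name == profile:
            break
    return exts

print(cumulative_extensions("RVA23"))
```

The point being: a distro targets one profile name instead of negotiating every extension individually.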
Governments seem to care about "self-sufficiency" a lot more these days, especially after what's happening in both China and the US right now.
If the choice is between an architecture owned, patented and managed by a single company domiciled in a foreign country, versus one which is an international standard and has multiple competing vendors, the latter suddenly seems a lot more attractive.
Price and performance don't matter that much. Governments are a lot less price-sensitive than consumers (and even businesses), they're willing to spend money to achieve their goals.
This is exactly what makes this such an interesting development. Standardization is part of the process of the CPU industry becoming a mature industry not dependent on the whims of individual companies. Boring, yes, but also stable.
>On May 5, 1993, Sun Microsystems announced Windows Application Binary Interface (WABI), a product to run Windows software on Unix, and the Public Windows Interface (PWI) initiative, an effort to standardize a subset of the popular 16-bit Windows APIs.
>In February 1994, the PWI Specification Committee sent a draft specification to X/Open—who rejected it in March, after being threatened by Microsoft's assertion of intellectual property rights (IPR) over the Windows APIs
Yes, and they're both massively debated and criticised, to the point that the industry developed RISC-V in the first place. Not to mention the rug-pull licensing ARM pulled a few years back.
It ticks a checkbox. That's it. Some organizations and/or governments might have rules that emphasize using international standards, and this might help with it.
I just hope it's going to be a "throw it over the fence and standardize" type of a deal, where the actual standardization process will still be outside of ISO (the ISO process is not very good - not my words, just ask the members of the C++ committee) and the text of the standard will be freely licensed and available to everyone (ISO paywalls its standards).
They're excited about putting the spec behind a notoriously closed paywall??
We older nerds will remember how Microsoft corrupted the entire ISO standardization process to ram Office Open XML (.docx/.xlsx/etc.) down the world's throat.
The original Office ISO standard was 6000+ pages and basically declared unreproducible outside of Microsoft themselves.
There is an entire Wikipedia article dedicated to the kafkaesque byzantine nightmare that was that standardization. [0]
I wish they'd write a test suite or certification program instead. Those ISO standard documents are nowadays better parseable with a chatbot, but they are still the wrong language for the job.
It would be very cool to run the compiled code developed in an ISO/IEC-standardized language on an ISO/IEC-standardized CPU. It might even be standard-compliant.
I don't understand why they want to put the RISC-V spec behind the ISO paywall. It will just complicate the access to the standardized version to confirm compliance with it.
Are there any promising core designs yet? Multi-core designs? Any promising extensions being standardized?
I really want to believe, but I don't think we'll see anything like an M5 chip anytime soon simply because there's so little investment from the bigger players.
Yeah Rivos apparently taped out a high performance server class core (probably only a test chip I'd guess) before Meta bought them.
There are plenty of multi core designs (that's easy) but they aren't very fast.
In terms of open source XiangShan is the most advanced as far as I know. It's fairly high performance out-of-order.
I don't think there's anything M5-level and probably won't be for a while (it took ARM decades so it's not a failing). I doubt we'll see any serious RISC-V laptops because there probably isn't demand (maybe Chromebooks though?). More likely to see phones and servers because Android is supporting RISC-V, and servers run Linux.
In terms of extensions I think it's pretty much all there. Probably it needs some kind of extension to make x86 emulation fast, like Apple did. The biggest extension I know of that isn't ratified is the P packed SIMD one but I don't know if there's much demand for that outside of DSPs.
I wonder why. Marketing? An ISO stamp being mandatory to access some specific markets? That said, they should be careful about what they pay to get the ISO stamp, and about which parts of RISC-V will be covered... because RVA may well see significant changes (after a while it may drop some hardware requirements that are mostly there to ease porting from legacy ISAs to RISC-V). Not to mention, there seem to be doubts about core memory reservations versus ZACAS, and only designers of large, performant RISC-V implementations could answer that; maybe it's a fluke.
It weirdly feels too early.
ISO is often the source of feature creep in programming languages or massive bloat (mechanically favoring some vendors) in file formats. Namely, everything from ISO must be looked at in the details to see if it is 'clean'.
People with absolutely no technical clue who only know "ISO 9001" equate "ISO" with quality initiatives and certifications.
What people with a better clue sometimes wrongly equate ISO with is interoperability.
ISO standards can help somewhat. If you have ISO RISC V, then you can analyze a piece of code and know, is this strictly ISO RISV code, or is it using vendor extensions.
If an architecture is controlled by a vendor, or a consortium, we still know analogous things: like does the program conform to some version of the ISA document from the vendor/consortium.
That vendor has a lot of power to take it in new directions though without getting anyone else to sign off.
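That kind of conformance analysis can be as simple as set membership over the decoded mnemonics. A toy sketch (the mnemonic sets below are tiny, hand-picked samples for illustration, not real spec contents; a real tool would decode binaries against the full instruction listing):

```python
# Toy conformance check: flag mnemonics outside an allowed base set.
# The sets below are small illustrative samples, not complete ISA listings.
BASE_RV64I = {"add", "addi", "sub", "ld", "sd", "beq", "jal", "jalr"}
VENDOR_EXT = {"th.lbia", "th.addsl"}   # e.g. T-Head custom-extension style names

def nonstandard(mnemonics):
    """Return the mnemonics not covered by the base set, sorted."""
    return sorted(set(mnemonics) - BASE_RV64I)

program = ["addi", "ld", "th.addsl", "beq"]
print(nonstandard(program))   # ['th.addsl'] -> relies on a vendor extension
```

With an ISO-frozen instruction list the reference set at least stops being a moving target.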
While the sentiment is a bit harsh, the performance gap noted is real. RISC-V has a ways to go to catch up to ARM64 and then finally AMD64 but if the Apple M1 taught us anything, it's possible.
RISC-V shouldn't try to catch up on 40 years of spiral development, but rather focus on something people can gather momentum around.
amd64 wasn't a great design, but it provided a painless migration path to 64-bit for x86 developers. Even Intel adopted this competitor's architecture.
I like the company making a multi-core pseudo-GPU card around RISC-V + DSP cores, but again, copying NVIDIA's bodged-on mailbox-style hardware is a mistake. It is like the world standardized around square wheels as a latency joke or something... lol
Making low-volume bespoke silicon is a fool's errand, and competing with a half-baked product in an established market makes for a failed company sooner or later.
I think people are confusing what I see with what I would like to see. An open ISA would be great, but at this point I can't even convince myself I'd buy a spool of such chips. =3
I would agree for FPGA soft-cpu the RISC-V is an obvious choice.
But in general, the next questions will be which version you deployed and which cross-compiler you use. All the documentation people find will carry caveats, or simply give contradictory guidance.
The problem isn't the ISA, but the ill-fated trap of trying to hit every use case (design-variant fragmentation). ARM6 made the same mistake, and ARM8/9 greatly consolidated around the 64-bit core design.
Indeed, an ISO standard may help narrow the project scope, but I doubt it can save the designs given the behavior some of its proponents have shown. =3
Sure, but what we saw was that most software simply disabled the advanced vendor-specific features in ARM and still compiled only for the stable core IP.
This is an important phenomenon that committee consensus couldn't reconcile. =3
RISC-V has always been an ivory tower, with a lot of bad decisions they double down on. Not surprised they're rushing towards this outdated stamp of authority too.
No overflow/carry flag, impacting safe overflow checking and bignum performance; the whole conditional-move history, the backpedaling, and the state of Zicond; a system for describing feature support that is needlessly complicated and a mess for users outside of embedded; a spec written more like an academic paper than a CPU manual; vector instructions that act like they're written for a coprocessor for some reason; bad frame-pointer ABI support; etc.
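On the carry-flag point: RISC-V's answer is that a carry can be recovered with one extra comparison, the classic unsigned-add idiom being `add t0, a0, a1` followed by `sltu t1, t0, a0`. A Python model of that idiom (64-bit wraparound simulated with a mask), illustrating why bignum code pays an extra instruction per limb:

```python
MASK64 = (1 << 64) - 1

def add_with_carry(a: int, b: int):
    """Model the RISC-V idiom for ISAs without a carry flag:
         add  t0, a0, a1   # wrapping 64-bit sum
         sltu t1, t0, a0   # carry = 1 iff the sum wrapped below an operand
    """
    s = (a + b) & MASK64        # wrapping 64-bit add
    carry = 1 if s < a else 0   # what sltu computes
    return s, carry

print(add_with_carry(2**64 - 1, 1))   # (0, 1): wrapped around, carry set
print(add_with_carry(2, 3))           # (5, 0): no carry
```

Whether one extra `sltu` per limb is an acceptable cost is exactly the debate the comment above alludes to.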
But yeah, the ISO standard doesn't hurt.
Is this real? Or FUD?
>> Is this real? Or FUD?
https://www.washingtontimes.com/news/2025/oct/20/risc-v-dese...
Somebody trying to influence Washington seems to want it shut down.
C, Java, Rust, JS, and C# do exist.
It's certainly a cautionary tale
Seems like this would take away a lot of power from RISC-V International. But I don't know much about this process.
Random example I found at a glance: NIST recommending use of a specific ISO standard in domains not formally covered by a regulatory body: https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.S...
Of course this is a lie. But yes, governments like to claim that.
You will find fossilized languages all over the place.
“We’re standards compliant”
(It's still a trade-off, because standards also cost community time and effort.)
https://en.wikipedia.org/wiki/Application_Programming_Interf...
It didn't become one, but it did become standardised as ECMA-234:
https://ecma-international.org/publications-and-standards/st...
>In February 1994, the PWI Specification Committee sent a draft specification to X/Open—who rejected it in March, after being threatened by Microsoft's assertion of intellectual property rights (IPR) over the Windows APIs
Looks like that's what it was.
Casual reminder that they ousted one of the founders of MPEG for daring to question the patent mess around H.265 (paraphrasing, a lot, of course)
ISO def lacks luster, and maybe even relevance.
[0] https://en.wikipedia.org/wiki/Standardization_of_Office_Open...
Formal model: https://github.com/riscv/sail-riscv
That's not gonna beat the M5, but it should be similar or better relative to M1, and a huge performance jump for RISC-V.
I doubt it - the ISO standard will still allow custom extensions.
...it was the same mistake that made ARM6 worse/more complex than modern ARM7/8/9. =3
Have you heard of this C++ thing? :)
The STL was good, but Boost proved a phenomenon...
https://en.wikipedia.org/wiki/Second-system_effect
ISO standards are often just a sign Process-people are in control =3
RISC-V is still too green, and fragmented-standards always look like a clown car of liabilities to Business people. =3
In the past if you didn't find something you needed, you'd design your own. Now you just tweak RISC-V.
I mean "12 variants of RISC-V" is actually less fragmentation than "RISC-V and 11 others".
As long as there is a stable core to target, that is all that matters for mainstream adoption, and profiles and distros are already there with RVA23.
https://en.wikipedia.org/wiki/Second-system_effect
100% of a small pie is worth far less than a slice from a large pie. I've met people that made that logical error, and it usually doesn't end well. =3
Could you elaborate?