
digitalmars.D - Google's take on memory safety

reply RazvanN <razvan.nitu1305 gmail.com> writes:
I stumbled upon this. Here is the abstract:

"2022 marked the 50th anniversary of memory safety 
vulnerabilities, first reported by Anderson et al. Half a century 
later, we are still dealing with memory safety bugs despite 
substantial investments to improve memory unsafe languages. Like 
others', Google’s data and internal vulnerability research show 
that memory safety bugs are widespread and one of the leading 
causes of vulnerabilities in memory-unsafe codebases. Those 
vulnerabilities endanger end users, our industry, and the broader 
society. At Google, we have decades of experience addressing, at 
scale, large classes of vulnerabilities that were once similarly 
prevalent as memory safety issues. Based on this experience we 
expect that high assurance memory safety can only be achieved via 
a Secure-by-Design approach centered around comprehensive 
adoption of languages with rigorous memory safety guarantees. We 
see no realistic path for an evolution of C++ into a language 
with rigorous memory safety guarantees that include temporal 
safety. As a consequence, we are considering a gradual transition 
of C++ code at Google towards other languages that are memory 
safe. Given the large volume of pre-existing C++, we believe it 
is nonetheless necessary to improve the safety of C++ to the 
extent practicable. We are considering transitioning to a safer 
C++ subset, augmented with hardware security features like MTE."

Here is the full paper: 
https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/70477b1d77462cfffc909ca7d7d46d8f749d5642.pdf

RazvanN
Mar 06
next sibling parent reply Sergey <kornburn yandex.ru> writes:
On Wednesday, 6 March 2024 at 09:19:20 UTC, RazvanN wrote:

 We see no realistic path for an evolution of C++
So the future of humanity is with JVM/Swift/Go/Rust?
Mar 06
parent reply RazvanN <razvan.nitu1305 gmail.com> writes:
On Wednesday, 6 March 2024 at 09:42:22 UTC, Sergey wrote:
 On Wednesday, 6 March 2024 at 09:19:20 UTC, RazvanN wrote:

 We see no realistic path for an evolution of C++
So the future of humanity is with JVM/Swift/Go/Rust?
Well, it might be D if we are able to convince people.
Mar 06
next sibling parent Sergey <kornburn yandex.ru> writes:
On Wednesday, 6 March 2024 at 10:46:48 UTC, RazvanN wrote:
 So the future of humanity is with JVM/Swift/Go/Rust?
Well, it might be D if we are able to convince people.
I think they are considering only popular languages
Mar 06
prev sibling parent reply Dodobird <doseiai gmail.com> writes:
On Wednesday, 6 March 2024 at 10:46:48 UTC, RazvanN wrote:
 On Wednesday, 6 March 2024 at 09:42:22 UTC, Sergey wrote:
 On Wednesday, 6 March 2024 at 09:19:20 UTC, RazvanN wrote:

 We see no realistic path for an evolution of C++
So the future of humanity is with JVM/Swift/Go/Rust?
Well, it might be D if we are able to convince people.
You don't need to convince people. The proof is in the pudding. Instead, a reddit thread explains much of this:

title: Why is D unpopular?
https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/

reply to the first comment by koczurekk, from randomguy4q5b3ty:
"In that sense, I have always felt that D's features are just too general for their own good."

So, D is missing the KISS principle: Keep It Simple, Stupid! Do one thing and do it well. Trying to be a jack of all trades makes you an expert in none.

So! Where does D have unmet potential to shine, as a pre-emptive strike against the rise of rivals, including Rust and the zombie language Java?

My answer: industrial robotics. If we can get a rock-solid core set of industrial robotics libraries, D will sell itself. And C, C++, Java/C++, and their descendants can move, for one thing done well, to D.

Is D willing to take on such a task? Why or why not? What would need to happen for this?
Mar 17
next sibling parent Dodobird <doseiai gmail.com> writes:
On Monday, 18 March 2024 at 00:43:37 UTC, Dodobird wrote:
 On Wednesday, 6 March 2024 at 10:46:48 UTC, RazvanN wrote:
 On Wednesday, 6 March 2024 at 09:42:22 UTC, Sergey wrote:
 On Wednesday, 6 March 2024 at 09:19:20 UTC, RazvanN wrote:

 We see no realistic path for an evolution of C++
So the future of humanity is with JVM/Swift/Go/Rust?
Well, it might be D if we are able to convince people.
You don't need to convince people. The proof is in the pudding. [...]
From a LinkedIn post: "...Context is King Kong, and Consistency is King Kong's MOTHER!"
https://media.licdn.com/dms/image/C4D12AQH_t0Tkls1dAg/article-inline_image-shrink_400_744/0/1597073878308?e=1715817600&v=beta&t=iaTKFdJ7itHmP1MxUlwxgxjD2jCiquHbkIKY11I39Mc (focus on the daniel-san image)
From: https://www.linkedin.com/pulse/anatomy-linkedin-post-jeff-young
Mar 17
prev sibling parent electricface <electricface qq.com> writes:
On Monday, 18 March 2024 at 00:43:37 UTC, Dodobird wrote:
 On Wednesday, 6 March 2024 at 10:46:48 UTC, RazvanN wrote:
 [...]
You don't need to convince people. The proof is in the pudding. Instead, a reddit explains much of this ------------------------------- title: Why is D unpopular? https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/ [...]
I think there are two directions for the D language to become popular:

- REPL: rapid interpretation and execution, similar to Python.
- AI translation of C projects into D projects: for example, translating projects on the level of GTK.
Mar 17
prev sibling next sibling parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
Temporal safety is something I am specifically interested in.

I don't think it's going to be one solution, but a group of features that 
will work together.

Isolated, reference counting, locking, atomics will all play a role.
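As a purely illustrative sketch (not the planned design) of how two of those pieces, reference counting and atomics, combine: the count is updated atomically, so exactly one owner frees the memory, which gives the temporal-safety property by construction. `SharedInt` is a made-up example type.

```d
// Hedged sketch: atomics + reference counting for temporal safety.
import core.atomic : atomicOp, atomicStore;
import core.stdc.stdlib : malloc, free;

struct SharedInt
{
    private static struct Block { shared int refs; int value; }
    private Block* block;

    static SharedInt create(int v)
    {
        auto b = cast(Block*) malloc(Block.sizeof);
        atomicStore(b.refs, 1); // one owner to start
        b.value = v;
        return SharedInt(b);
    }

    this(this) // a copy means one more owner
    {
        if (block) atomicOp!"+="(block.refs, 1);
    }

    ~this() // the last owner, and only the last owner, frees
    {
        if (block && atomicOp!"-="(block.refs, 1) == 0)
            free(block);
    }

    int get() { return block.value; }
}
```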

Right now my research needs to go into type state analysis; getting a 
design for that will enable us to support the more interesting logic.

That will also handle nullability of pointers too, so it has a lot of 
benefit.

Lastly, dmd-fe needs to get a major upgrade in its analysis of memory.

We need to be able to track which variables contribute towards the 
assignment (SSA) of another variable, along with values (`new` etc.).

We also need to track which variable contributes towards a function 
argument, and the parameter it maps to.

I tried to start writing up some analysis of this at the end of 
semantic3, right before @live (to replace it), but I ran into trouble 
at the AST level.

I really need a UML class diagram of the AST, along with a way to dump 
the AST as XML (although an object diagram would be nice too).
I haven't tried writing a tool for that (a class diagram of the AST 
should auto-generate an image in the PR), but I expect it'll be the 
same problem wrt. AST understanding.
Mar 06
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Wed, Mar 06, 2024 at 09:19:20AM +0000, RazvanN via Digitalmars-d wrote:
 I stumbled upon this. Here is the abstract:
 
 "2022 marked the 50th anniversary of memory safety vulnerabilities,
 first reported by Anderson et al. Half a century later, we are still
 dealing with memory safety bugs despite substantial investments to
 improve memory unsafe languages. [...] We see no realistic path for an
 evolution of C++ into a language with rigorous memory safety
 guarantees that include temporal safety. As a consequence, we are
 considering a gradual transition of C++ code at Google towards other
 languages that are memory safe.
[...]

Called it. Since years ago. It was clear then, and it's becoming even clearer by the day, in today's landscape of bots that constantly scan the net for systems with memory vulnerabilities, that the days of memory-unsafe languages like C or C++ are numbered. Software systems have developed to the point that manually managing memory just doesn't cut it anymore. Systems have become too large and too complex, and memory issues have become intractably difficult to handle manually. It's time to let go of the illusion of total control over one's memory usage and make use of real solutions. Like the GC. ;-) Or any of the various automated memory management schemes.

Also, when is D gonna get @safe by default?! Seriously, it's been years. Why are we still stuck at the same old stalemate over something so irritatingly trivial like having extern(C) *not* default to @safe?! It should have been a 5-minute decision, yet at the rate things are going it will soon be a 5-year decision. If even that. Let's hope it's not going to be a 50-year decision(!). :-/

T

-- 
Making a boat out of stone would be a hardship.
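For context, a minimal sketch (illustrative only) of what @safe checking means: inside a @safe function the compiler rejects memory-unsafe operations at compile time, which is what "@safe by default" would extend to all unannotated code.

```d
// Illustrative only: @safe functions are checked by the compiler,
// which rejects memory-unsafe operations outright.
import std.stdio;

@safe int sumFirstTwo(int[] a)
{
    return a[0] + a[1]; // array access is bounds-checked in @safe code
}

@safe void rejectedBits()
{
    // The following are compile errors inside @safe code:
    // int* p; p += 1;          // pointer arithmetic
    // auto q = cast(int*) 0;   // unsafe cast from integer to pointer
}

void main() @safe
{
    writeln(sumFirstTwo([1, 2, 3])); // prints 3
}
```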
Mar 06
next sibling parent M. M. <matus email.cz> writes:
On Wednesday, 6 March 2024 at 17:16:24 UTC, H. S. Teoh wrote:
 [...]

 Also, when is D gonna get @safe by default?! Seriously, it's been years. Why are we still stuck at the same old stalemate over something so irritatingly trivial like having extern(C) *not* default to @safe?! It should have been a 5-minute decision, yet at the rate things are going it will soon be a 5-year decision. If even that. Let's hope it's not going to be a 50-year decision(!). :-/
Yeah, with all the waves the US government is making with its call to use safe programming languages, it would have been nice to make @safe the default before that call came. Now, with the explicit call to not use C/C++, making extern(C) not @safe sounds like a no-brainer to me.
Mar 06
prev sibling next sibling parent reply M. M. <matus email.cz> writes:
On Wednesday, 6 March 2024 at 17:16:24 UTC, H. S. Teoh wrote:
 On Wed, Mar 06, 2024 at 09:19:20AM +0000, RazvanN via 
 Digitalmars-d wrote:
 [...]
[...] Called it. Since years ago. [...]
I wonder what Linus is thinking about all this bashing of C...
Mar 06
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Wed, Mar 06, 2024 at 06:51:47PM +0000, M. M. via Digitalmars-d wrote:
 On Wednesday, 6 March 2024 at 17:16:24 UTC, H. S. Teoh wrote:
 On Wed, Mar 06, 2024 at 09:19:20AM +0000, RazvanN via Digitalmars-d
 wrote:
[...]
 Called it. Since years ago. It's been clear then, and becoming even
 clearer now by the day, in today's landscape of bots that constantly
 scan the net for systems with memory vulnerabilities, that the days
 of memory-unsafe languages like C or C++ are numbered.  Software
 systems have developed to the point that manually managing memory
 just doesn't cut it anymore.  Systems have become too large, too
 complex, and memory issues have become intractably difficult to
 handle manually.  It's time to let go of the illusion of total
 control over one's memory usage and make use of real solutions.
 Like the GC. ;-)  Or any of the various automated memory management
 schemes.
[...]
 I wonder what Linus is thinking about all this bashing of C...
Linus, being Linus, probably doesn't care. :-D Well, he'd probably write a diatribe about why they're totally wrong, and then he'll just ignore them and keep doing whatever he's doing.

None of that stops the inevitable, though. Memory-unsafe languages are on their way out. It may take another 20 years, or it may take 50 years, but make no mistake, their demise will come. That much is sure.

T

-- 
Not all rumours are as misleading as this one.
Mar 06
next sibling parent reply Sergey <kornburn yandex.ru> writes:
On Wednesday, 6 March 2024 at 19:13:26 UTC, H. S. Teoh wrote:
 languages are on their way out. It may take another 20 years, 
 or it may take 50 years, but make no mistake, their demise will
Some CEOs are expecting that in 5 years nobody will need programming because of AI :) And AI will be banned from using “unsafe” code :)

https://m.youtube.com/watch?v=r2npdV6tX1g
Mar 06
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Wed, Mar 06, 2024 at 09:16:23PM +0000, Sergey via Digitalmars-d wrote:
 On Wednesday, 6 March 2024 at 19:13:26 UTC, H. S. Teoh wrote:
 languages are on their way out. It may take another 20 years, or it
 may take 50 years, but make no mistake, their demise will
Some CEOs are expecting that in 5 years nobody will need programming because of AI :) And AI will be banned from using “unsafe” code :)
[...]

What people are calling "AI" these days is nothing but a glorified interpolation algorithm, boosted by having access to an internet's worth of data it can interpolate from, giving it a superficial semblance of "intelligence". The algorithm is literally unable to produce correct code besides that which has already been written (and published online) by someone else.

Ask it to write code that has an existing, correct implementation, and you have a chance of getting correct, working code. Ask it to write something that has never been written before... I'd really look into taking out life insurance before putting the resulting code in production.

T

-- 
Unix was not designed to stop people from doing stupid things, because that would also stop them from doing clever things. -- Doug Gwyn
Mar 06
next sibling parent reply aberba <karabutaworld gmail.com> writes:
On Wednesday, 6 March 2024 at 21:34:45 UTC, H. S. Teoh wrote:
 On Wed, Mar 06, 2024 at 09:16:23PM +0000, Sergey via 
 Digitalmars-d wrote:
 On Wednesday, 6 March 2024 at 19:13:26 UTC, H. S. Teoh wrote:
 languages are on their way out. It may take another 20 
 years, or it may take 50 years, but make no mistake, their 
 demise will
Some CEOs expecting in 5 years nobody will need programming because of AI :) And AI will be banned to use “unsafe” code :)
[...] What people are calling "AI" these days is nothing but a glorified interpolation algorithm [...]
Well, that's pretty much what the idea of training an AI model is. You train a model on existing data so it learns from it. AIs aren't able to come up with original ideas. However, it should be noted that AI does a "bit" better at concocting results, just not original ideas (the very same applies to humans most of the time, actually, but with self- and context-awareness).

Also, depending on the task, the results can be better. For example, it appears natural language processing has progressed further than other AI tasks/fields.

For sure, there's too much hype, but that's just to bring in VC investment. My observation.
Mar 07
parent Dom DiSc <dominikus scherkl.de> writes:
On Thursday, 7 March 2024 at 08:12:36 UTC, aberba wrote:
 Also depending on the task, the results can be better. For 
 example it appears natural language processing has progressed 
 better than other AI tasks/fields.
That's only because for natural language there is better training data, because humans are good at natural language. (Still, AI is not as good at it as humans, just no longer so bad that one shrugs in disappointment.) But in programming languages humans are not as good, so the training data isn't either. I don't have high hopes for using such an AI.
Mar 07
prev sibling parent DrDread <DrDread cheese.com> writes:
On Wednesday, 6 March 2024 at 21:34:45 UTC, H. S. Teoh wrote:
 On Wed, Mar 06, 2024 at 09:16:23PM +0000, Sergey via 
 Digitalmars-d wrote:
 On Wednesday, 6 March 2024 at 19:13:26 UTC, H. S. Teoh wrote:
 languages are on their way out. It may take another 20 
 years, or it may take 50 years, but make no mistake, their 
 demise will
Some CEOs expecting in 5 years nobody will need programming because of AI :) And AI will be banned to use “unsafe” code :)
[...] What people are calling "AI" these days is nothing but a glorified interpolation algorithm [...]
In my experience, it produces better results than a lot of programmers, and does so in seconds instead of months. And it's really not just interpolation either. But obviously it's still far from perfect. It has become a useful tool, though.
Mar 07
prev sibling parent reply matheus <matheus gmail.com> writes:
On Wednesday, 6 March 2024 at 19:13:26 UTC, H. S. Teoh wrote:
 ...
I think you're being too harsh on Linus Torvalds. Back in the day I really think he was right regarding the C vs C++ battle. A couple of years ago he even agreed to some Rust experiments in the kernel, and pointed out some problems later too. He is a technical person; he usually points out flaws very correctly. Matheus.
Mar 06
parent reply Martyn <martyn.developer googlemail.com> writes:
On Wednesday, 6 March 2024 at 21:34:27 UTC, matheus wrote:
 On Wednesday, 6 March 2024 at 19:13:26 UTC, H. S. Teoh wrote:
 ...
I think you're being too harsh on Linus Torvalds. Back in the day I really think he was right regarding the C vs C++ battle. A couple of years ago he even agreed to some Rust experiments in the kernel, and pointed out some problems later too. He is a technical person; he usually points out flaws very correctly. Matheus.
On top of this, Linus knows that he needs to accept the ever-changing industry. He won't be a maintainer forever; he has to pass the torch at some point. Younger developers coming in are less likely to be "C experts" or to have any interest in becoming one when we have languages like Rust. I am sure this factors in (with other reasons) to why the kernel needs to remain relevant for the new generation.

The number of C gurus will shrink in the next 20 years, which is likely to align with the transition in the Linux kernel to another language. Rust is likely to gain more responsibility in the kernel at that point, and it is unlikely to be any other language (in my opinion).

From what I have seen, Linus seems pretty level-headed towards Rust. I don't think he has any interest in learning it. Behind the curtains he might hate the language as much as he does C++. It is moving with the times.
Mar 11
parent reply dweldon <danny.weldon gmail.com> writes:
On Monday, 11 March 2024 at 09:41:16 UTC, Martyn wrote:
 From what I have seen, Linus seems pretty level-headed towards Rust. I don't think he has any interest in learning it. Behind the curtains he might hate the language as much as he does C++. It is moving with the times.
Rust is already in the Linux kernel: https://thenewstack.io/rust-in-the-linux-kernel/ https://docs.kernel.org/rust/index.html
Mar 12
parent Martyn <martyn.developer googlemail.com> writes:
On Tuesday, 12 March 2024 at 08:29:08 UTC, dweldon wrote:
 On Monday, 11 March 2024 at 09:41:16 UTC, Martyn wrote:
 From what I have seen, Linus seems pretty level-headed towards Rust. I don't think he has any interest in learning it. Behind the curtains he might hate the language as much as he does C++. It is moving with the times.
Rust is already in the Linux kernel: https://thenewstack.io/rust-in-the-linux-kernel/ https://docs.kernel.org/rust/index.html
Yes. I know most on this forum know this, which is why I did not state it.
Mar 12
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 3/6/2024 9:16 AM, H. S. Teoh wrote:
 Called it.
I announced at the Berlin DConf that C was doomed because of memory unsafety!
Mar 07
parent reply Monkyyy <crazymonkyyy gmail.com> writes:
On Friday, 8 March 2024 at 04:50:38 UTC, Walter Bright wrote:
 On 3/6/2024 9:16 AM, H. S. Teoh wrote:
 Called it.
I announced at the Berlin DConf that C was doomed because of memory unsafety!
C is still here, and C will continue to be here until something else makes it drastically worthwhile to redo the giant stack of code that is GNU and the Linux kernel, which, despite many many delusional proclamations... errrr, no, I'm not swapping to a Rust OS anytime in the near future.

*Why are you proud of a wrong prediction*, being remade by loud corps who will also be wrong?

https://www.tiobe.com/tiobe-index/

Python, Java, and JS are not exactly safe languages; there is no way to interpret their high ranking as being coherently designed around safety.

Headlines and noise that people reshare to feel smart, but that do not change their behavior, are not reality.
Mar 08
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 8 March 2024 at 09:09:12 UTC, Monkyyy wrote:
 On Friday, 8 March 2024 at 04:50:38 UTC, Walter Bright wrote:
 On 3/6/2024 9:16 AM, H. S. Teoh wrote:
 Called it.
I announced at the Berlin DConf that C was doomed because of memory unsafety!
C is still here, C will continue to be here [...]
"The US government says it would be better for them if you ceased using C or C++ when programming tools. In a recent report, the White House Office of the National Cyber Director (ONCD) has urged developers to utilize “memory-safe programming languages,” a classification that does not include widely used languages. The recommendation is a step toward “securing the building blocks of cyberspace” and is a component of US President Biden’s cybersecurity plan."

https://readwrite.com/the-nsa-list-of-memory-safe-programming-languages-has-been-updated/

The issue at hand is safety from memory corruption, not 100% safety.

Just wait until delivering software written in C or C++ requires a biohazard-symbol, "handle with care" kind of regulation, and insurance companies charge high premiums on software developed with such languages.
Mar 08
next sibling parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Friday, 8 March 2024 at 10:43:39 UTC, Paulo Pinto wrote:
 
 Just wait until delivering software written in C or C++ 
 requires a biohazard symbol "handle with care" kind of 
 regulation
Why would this happen? Is there a reason to expect it to happen?

How do you square this with Microsoft being a friend of the CIA, while Microsoft also has a giant pile of C and C++ that it is debatable whether they are capable of rewriting?
Mar 08
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 8 March 2024 at 17:14:00 UTC, monkyyy wrote:
 On Friday, 8 March 2024 at 10:43:39 UTC, Paulo Pinto wrote:
 
 Just wait until delivering software written in C or C++ 
 requires a biohazard symbol "handle with care" kind of 
 regulation
Why would this happen? Is there a reason to expect it to happen? How do you square this with Microsoft being a friend of the CIA, while Microsoft also has a giant pile of C and C++ that it is debatable whether they are capable of rewriting?
Yes, that is the expected outcome of US and EU cybersecurity bills that make companies liable for security exploits.

Rust is now the official systems programming language for Azure infrastructure teams, alongside managed languages. Use of C and C++ is constrained to existing code bases.

Rust is also shipping in the Windows kernel already.
Mar 08
parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Fri, Mar 08, 2024 at 06:59:28PM +0000, Paulo Pinto via Digitalmars-d wrote:
[...]
 Rust is now the official systems programming language for Azure
 infrastructure teams, alongside managed languages. Use of C and C++ is
 constrained to existing code bases.
 
 Rust is also shipping in the Windows kernel already.
Like I said, the inevitable has begun. It's just a matter of time now. The clock is already ticking for memory-unsafe languages. It may not be as fast as some may think or want, but that day will undoubtedly come when these languages fade away in the rearview mirror.

T

-- 
Старый друг лучше новых двух.
Mar 08
prev sibling parent reply Gregor Mückl <gregormueckl gmx.de> writes:
On Friday, 8 March 2024 at 10:43:39 UTC, Paulo Pinto wrote:
 Just wait until delivering software written in C or C++ 
 requires a biohazard symbol "handle with care" kind of 
 regulation, and insurance companies high premiums on software 
 developed with such languages.
This isn't going to happen in this century. You're talking about an absolutely *gigantic* amount of software: an utterly, unfathomably big amount. Many thousand lifetimes' worth of work. A quick estimate tells me that my computer is running several *hundred* million lines of code just for firmware, OS, drivers, shell/GUI, browser etc. so that I can write this message. Mandating a rewrite of all of that is both a fool's errand and economic suicide for whatever nation wants to enforce such a mandate.

To my knowledge, the last major OS kernel that was started from scratch was Linux (I believe that the roots of the current macOS/iOS/visionOS kernel are actually older, and NT's certainly are). No newer kernel has reached a similar level of maturity. All major browsers currently in use have their roots in the 90s or early 2000s. It's quite easy to continue this list with all kinds of application software. It would be a major miracle if even a single one of these chunks of software got replaced by a rewrite from scratch within the next one or two decades. Replacing all of them at once is so much effort that it would mean complete industry-wide stagnation for decades.

The best that can happen is a glacially slow migration of single components to other languages that are (perceived to be) more modern. At worst, we end up with a stack that stays the way it is and gets another layer of glossy paint poured over it. Given the state of our industry, that's the more likely outcome. Any attempt to politically enforce anything more radical than that will be met with enormous and vicious resistance from companies, which I would expect to be successful.
Mar 08
next sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Saturday, 9 March 2024 at 02:33:16 UTC, Gregor Mückl wrote:
 On Friday, 8 March 2024 at 10:43:39 UTC, Paulo Pinto wrote:
 [...]
This isn't going to happen in this century. You're talking about an absolutely *gigantic* amount of software - an utterly, unfathomably, big amount. Many thousand lifetimes' worth of work. [...]
I have first-hand information about the insurance that one big consultancy company (I mean, one among Accenture, PWC, Deloitte, NTT, etc.) is paying for cybersecurity risk, and the amount is simply skyrocketing. And there's also a big reputational risk for their customers.
Mar 09
prev sibling next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 9 March 2024 at 02:33:16 UTC, Gregor Mückl wrote:
 On Friday, 8 March 2024 at 10:43:39 UTC, Paulo Pinto wrote:
 [...]
This isn't going to happen in this century. You're talking about an absolutely *gigantic* amount of software - an utterly, unfathomably, big amount. Many thousand lifetimes' worth of work. [...]
This is already happening. In Germany, if a company is proven not to have provided adequate security measures, customers are allowed to take them to court as per the latest cybersecurity laws. Additionally, in consulting, security fixes have to be provided free of charge by the delivery company when it is proven liable.
Mar 09
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 3/8/2024 6:33 PM, Gregor Mückl wrote:
 You're talking about an absolutely *gigantic* amount of software - an utterly, 
 unfathomably, big amount. Many thousand lifetimes' worth of work.
That's why D enables hybrid C/D programs. D can be gradually introduced into an existing C code base.
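A minimal sketch of the hybrid model (illustrative only; it binds to C's own strlen from the C runtime as a stand-in for an existing C code base): D calls into unchanged C code via extern(C) declarations, so a code base can migrate one module at a time.

```d
// Hedged sketch: calling existing C code from D with no wrapper layer.
import std.stdio;

// Declaration matching a function in the existing C code base.
// Here we use the C runtime's strlen as the stand-in.
extern (C) size_t strlen(const(char)* s);

void main()
{
    // D string literals are null-terminated, so .ptr is a valid C string.
    writeln(strlen("hello".ptr)); // prints 5
}
```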
Mar 09
parent reply Lance Bachmeier <no spam.net> writes:
On Saturday, 9 March 2024 at 20:18:41 UTC, Walter Bright wrote:
 On 3/8/2024 6:33 PM, Gregor Mückl wrote:
 You're talking about an absolutely *gigantic* amount of 
 software - an utterly, unfathomably, big amount. Many thousand 
 lifetimes' worth of work.
That's why D enables hybrid C/D programs. D can be gradually introduced into an existing C code base.
I "ported" a few thousand lines of C to D in a couple hours this afternoon. That includes the time it took to put all C memory allocation inside SafeRefCounted. With the overhead out of the way (setting up the SafeRefCounted structs, testing, and some minor other things) I bet I could easily port 20,000 lines in an 8-hour day. Working directly with C macros was the last thing needed to make this go fast.
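The pattern described above, in a hedged sketch (assumes std.typecons.SafeRefCounted, available since Phobos 2.101; `Buffer` is a made-up payload, not the actual ported code): the raw C allocation lives inside a payload struct whose destructor frees it, and SafeRefCounted does the counting.

```d
// Hedged sketch: putting a raw C allocation under SafeRefCounted so
// the malloc/free discipline of the ported C code becomes automatic.
import core.stdc.stdlib : malloc, free;
import std.typecons : SafeRefCounted;

struct Buffer
{
    double* data;
    size_t length;

    this(size_t n) @trusted
    {
        data = cast(double*) malloc(n * double.sizeof);
        length = n;
    }

    ~this() @trusted // runs when the last reference goes away
    {
        if (data !is null) { free(data); data = null; }
    }

    @disable this(this); // copies go through the ref count, not the struct
}

alias RcBuffer = SafeRefCounted!Buffer;

void main()
{
    auto buf = RcBuffer(8); // freed automatically with the last reference
    assert(buf.length == 8);
}
```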
Mar 09
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 3/9/2024 3:32 PM, Lance Bachmeier wrote:
 I "ported" a few thousand lines of C to D in a couple hours this afternoon. That includes the time it took to put all C memory allocation inside SafeRefCounted. With the overhead out of the way (setting up the SafeRefCounted structs, testing, and some minor other things) I bet I could easily port 20,000 lines in an 8-hour day. Working directly with C macros was the last thing needed to make this go fast.
Thanks for posting that, I enjoy such testimonials!
Mar 09
next sibling parent reply Emmanuel Danso Nyarko <emmankoko519 gmail.com> writes:
On Sunday, 10 March 2024 at 04:24:15 UTC, Walter Bright wrote:
 On 3/9/2024 3:32 PM, Lance Bachmeier wrote:
 I "ported" a few thousand lines of C to D in a couple hours 
 this afternoon. That includes the time it took to put all C 
 memory allocation inside SafeRefCounted. With the overhead out 
 of the way (setting up the SafeRefCounted structs, testing, 
 and some minor other things) I bet I could easily port 20,000 
 lines in an 8-hour day. Working directly with C macros was the 
 last thing needed to make this go fast.
Thanks for posting that, I enjoy such testimonials!
How about we build a strategy to get D out there! We must let the world see the power of D.
Mar 10
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 3/10/2024 1:34 AM, Emmanuel Danso Nyarko wrote:
 How about we build a strategy to get D out there! We must let the 
 world see the power of D.
Want to take point on that?
Mar 10
prev sibling parent reply bachmeier <no spam.net> writes:
On Sunday, 10 March 2024 at 09:34:28 UTC, Emmanuel Danso Nyarko 
wrote:
 On Sunday, 10 March 2024 at 04:24:15 UTC, Walter Bright wrote:
 On 3/9/2024 3:32 PM, Lance Bachmeier wrote:
 I "ported" a few thousand lines of C to D in a couple hours 
 this afternoon. That includes the time it took to put all C 
 memory allocation inside SafeRefCounted. With the overhead 
 out of the way (setting up the SafeRefCounted structs, 
 testing, and some minor other things) I bet I could easily 
 port 20,000 lines in an 8-hour day. Working directly with C 
 macros was the last thing needed to make this go fast.
Thanks for posting that, I enjoy such testimonials!
How about we build a strategy to get D out there! We must let the world see the power of D.
Show them working code. This is a separate project I did after the one I posted about in my previous comment: https://github.com/bachmeil/d-gslrng It took only a few hours and there's over 7000 lines of C.

There are some nice features of this project:

- I made zero changes to the C code. Now that we have macro support, every line in the C files was copied and pasted. That means I get to reuse the decades of testing done on this popular library.
- I was able to strip out a small part of a much larger library. If I were calling into a C library, I'd be stuck with whatever they give me.
- There's no shared library dependency. That means support for every OS out of the box. No bindings or wrappers. Aside from the previous comment about stripping out most of the library, I can make changes to the functions if I want. With a shared library you either use what they give you or you maintain your own fork in order to share your work with others.
Mar 13
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 13 March 2024 at 19:18:20 UTC, bachmeier wrote:
 On Sunday, 10 March 2024 at 09:34:28 UTC, Emmanuel Danso Nyarko 
 wrote:
 On Sunday, 10 March 2024 at 04:24:15 UTC, Walter Bright wrote:
 On 3/9/2024 3:32 PM, Lance Bachmeier wrote:
 I "ported" a few thousand lines of C to D in a couple hours 
 this afternoon. That includes the time it took to put all C 
 memory allocation inside SafeRefCounted. With the overhead 
 out of the way (setting up the SafeRefCounted structs, 
 testing, and some minor other things) I bet I could easily 
 port 20,000 lines in an 8-hour day. Working directly with C 
 macros was the last thing needed to make this go fast.
Thanks for posting that, I enjoy such testimonials!
How about we build a strategy to get D out there! We must let the world see the power of D.
 Show them working code. This is a separate project I did after the 
 one I posted about in my previous comment: 
 https://github.com/bachmeil/d-gslrng It took only a few hours and 
 there's over 7000 lines of C. There are some nice features of this 
 project:
 - I made zero changes to the C code. Now that we have macro support, 
 every line in the C files was copied and pasted. That means I get to 
 reuse the decades of testing done on this popular library.
 - I was able to strip out a small part of a much larger library. If 
 I were calling into a C library, I'd be stuck with whatever they 
 give me.
[snip]

Cool! So this proves that ImportC works on a limited subset of GSL. Do you have any reason to believe it wouldn’t work on the whole library?
Mar 13
parent reply bachmeier <no spam.net> writes:
On Wednesday, 13 March 2024 at 20:49:01 UTC, jmh530 wrote:
 On Wednesday, 13 March 2024 at 19:18:20 UTC, bachmeier wrote:
 On Sunday, 10 March 2024 at 09:34:28 UTC, Emmanuel Danso 
 Nyarko wrote:
 On Sunday, 10 March 2024 at 04:24:15 UTC, Walter Bright wrote:
 On 3/9/2024 3:32 PM, Lance Bachmeier wrote:
 I "ported" a few thousand lines of C to D in a couple hours 
 this afternoon. That includes the time it took to put all C 
 memory allocation inside SafeRefCounted. With the overhead 
 out of the way (setting up the SafeRefCounted structs, 
 testing, and some minor other things) I bet I could easily 
 port 20,000 lines in an 8-hour day. Working directly with C 
 macros was the last thing needed to make this go fast.
Thanks for posting that, I enjoy such testimonials!
How about we build a strategy to get D out there! We must let the world see the power of D.
 Show them working code. This is a separate project I did after the 
 one I posted about in my previous comment: 
 https://github.com/bachmeil/d-gslrng It took only a few hours and 
 there's over 7000 lines of C. There are some nice features of this 
 project:
 - I made zero changes to the C code. Now that we have macro support, 
 every line in the C files was copied and pasted. That means I get to 
 reuse the decades of testing done on this popular library.
 - I was able to strip out a small part of a much larger library. If 
 I were calling into a C library, I'd be stuck with whatever they 
 give me.
 [snip]

 Cool! So this proves that ImportC works on a limited subset of GSL. 
 Do you have any reason to believe it wouldn’t work on the whole 
 library?
I'd be surprised if there's anything it can't compile. I've compiled lots of other parts, but split this one out because I don't want to add hundreds of unnecessary files if all I want to do is generate random draws in parallel.
Mar 13
parent Dodobird <doseiai gmail.com> writes:
On Wednesday, 13 March 2024 at 21:13:31 UTC, bachmeier wrote:
 On Wednesday, 13 March 2024 at 20:49:01 UTC, jmh530 wrote:
 On Wednesday, 13 March 2024 at 19:18:20 UTC, bachmeier wrote:
 On Sunday, 10 March 2024 at 09:34:28 UTC, Emmanuel Danso 
 Nyarko wrote:
 On Sunday, 10 March 2024 at 04:24:15 UTC, Walter Bright 
 wrote:
 On 3/9/2024 3:32 PM, Lance Bachmeier wrote:
 I "ported" a few thousand lines of C to D in a couple 
 hours this afternoon. That includes the time it took to 
 put all C memory allocation inside SafeRefCounted. With 
 the overhead out of the way (setting up the SafeRefCounted 
 structs, testing, and some minor other things) I bet I 
 could easily port 20,000 lines in an 8-hour day. Working 
 directly with C macros was the last thing needed to make 
 this go fast.
Thanks for posting that, I enjoy such testimonials!
How about we build a strategy to get D out there! We must let the world see the power of D.
 Show them working code. This is a separate project I did after the 
 one I posted about in my previous comment: 
 https://github.com/bachmeil/d-gslrng It took only a few hours and 
 there's over 7000 lines of C. There are some nice features of this 
 project:
 - I made zero changes to the C code. Now that we have macro support, 
 every line in the C files was copied and pasted. That means I get to 
 reuse the decades of testing done on this popular library.
 - I was able to strip out a small part of a much larger library. If 
 I were calling into a C library, I'd be stuck with whatever they 
 give me.
 [snip]

 Cool! So this proves that ImportC works on a limited subset of GSL. 
 Do you have any reason to believe it wouldn’t work on the whole 
 library?
I'd be surprised if there's anything it can't compile. I've compiled lots of other parts, but split this one out because I don't want to add hundreds of unnecessary files if all I want to do is generate random draws in parallel.
Yes, a strategy. Without it business will say 不靠谱 (boo cow poo, aka unreliable) and flee. You never wanna hear that from customers, and when you do, bust your ass to rebuild the trust that was lost. If you don't know where you are going, any road will get you there.

(lyrics)
But she wants everything (He can pretend to give her everything)
Or there's nothing she wants (She don't want to sort it out)
He's crazy for this girl (But she don't know what she's looking for)
If she knew what she wants
He'd be giving it to her
Giving it to her
I'd say her values are corrupted
But she's open to change
Then one day she's satisfied
And the next I'll find her crying
And it's nothing she can explain

------
"If She Knew What She Wants"
Song by The Bangles, 1986
https://www.youtube.com/watch?v=mu_pNeqAQ-U
Mar 17
prev sibling parent reply harakim <harakim gmail.com> writes:
On Wednesday, 13 March 2024 at 19:18:20 UTC, bachmeier wrote:
 On Sunday, 10 March 2024 at 09:34:28 UTC, Emmanuel Danso Nyarko 
 wrote:
 On Sunday, 10 March 2024 at 04:24:15 UTC, Walter Bright wrote:
 On 3/9/2024 3:32 PM, Lance Bachmeier wrote:
 I "ported" a few thousand lines of C to D in a couple hours 
 this afternoon. That includes the time it took to put all C 
 memory allocation inside SafeRefCounted. With the overhead 
 out of the way (setting up the SafeRefCounted structs, 
 testing, and some minor other things) I bet I could easily 
 port 20,000 lines in an 8-hour day. Working directly with C 
 macros was the last thing needed to make this go fast.
Thanks for posting that, I enjoy such testimonials!
How about we build a strategy to get D out there! We must let the world see the power of D.
 Show them working code. This is a separate project I did after the 
 one I posted about in my previous comment: 
 https://github.com/bachmeil/d-gslrng It took only a few hours and 
 there's over 7000 lines of C. There are some nice features of this 
 project:
 - I made zero changes to the C code. Now that we have macro support, 
 every line in the C files was copied and pasted. That means I get to 
 reuse the decades of testing done on this popular library.
 - I was able to strip out a small part of a much larger library. If 
 I were calling into a C library, I'd be stuck with whatever they 
 give me.
 - There's no shared library dependency. That means support for every 
 OS out of the box. No bindings or wrappers. Aside from the previous 
 comment about stripping out most of the library, I can make changes 
 to the functions if I want. With a shared library you either use 
 what they give you or you maintain your own fork in order to share 
 your work with others.
I will admit that I didn't know what ImportC was all this time. I gathered it must be something that lets you easily port C code, and then thought it sounded like you could just use C code directly in D. That seemed like a great goal, but obviously that couldn't be it, because then D would be a lot more popular. Now you say this is true!?

This is such an enormous benefit! This should be a huge driver for D. With this functionality, I could easily see it being the language of the year on TIOBE, for example. Good work, D team! I hope you can follow it up with some success in adoption.
Mar 16
parent Lance Bachmeier <no spam.net> writes:
On Saturday, 16 March 2024 at 18:47:40 UTC, harakim wrote:

 That seemed like a great goal, but obviously that couldn't be 
 it because D would be a lot more popular. Now you say this is 
 true!?
The thing is, it is only very recently that it works well enough to be a practical way to incorporate C code into your project. Initially it just compiled C code. The only problem is that there's very little pure C code in the wild: what you actually find is a mix of C, the preprocessor, and compiler extensions. Over time, that was addressed, but the one remaining (big) limitation was that it couldn't deal with function-like macros. Walter recently remedied that, so for the first time you can grab a 10,000-line .c file and have an expectation that it'll compile cleanly.

The only issue I've had is a minor bug that has an easy workaround. Other than that, tens of thousands of lines of C code have compiled for me. Note that right now you need to use the DMD release candidate for 2.108.
 This is such an enormous benefit! This should be a huge driver 
 for D. With this functionality, I could easily see it being the 
 language of the year on tiobe, for example.
It's hard to predict usage, but it's a darn good tool for working with a legacy C codebase. With module support, you can even add import statements to your C files to import D functions. I've used that in a couple cases to eliminate dependencies.
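As a hedged illustration of that last point: `__import` is ImportC's extension for pulling D modules into a C file, so the sketch below only compiles under DMD's ImportC, not a standard C compiler. The module name dhelpers and the function dclamp are invented for the example.

```c
/* helper.c -- ImportC source; __import is a DMD extension,
   not standard C. */
__import dhelpers;          /* hypothetical D module */

int scale(int x)
{
    return dclamp(x * 2);   /* calls straight into D code */
}
```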
Mar 16
prev sibling parent reply Martyn <martyn.developer googlemail.com> writes:
On Sunday, 10 March 2024 at 04:24:15 UTC, Walter Bright wrote:
 On 3/9/2024 3:32 PM, Lance Bachmeier wrote:
 I "ported" a few thousand lines of C to D in a couple hours 
 this afternoon. That includes the time it took to put all C 
 memory allocation inside SafeRefCounted. With the overhead out 
 of the way (setting up the SafeRefCounted structs, testing, 
 and some minor other things) I bet I could easily port 20,000 
 lines in an 8-hour day. Working directly with C macros was the 
 last thing needed to make this go fast.
Thanks for posting that, I enjoy such testimonials!
+1 Very impressive!
Mar 11
parent =?UTF-8?B?R2VybcOhbg==?= Diago <germandiago gmail.com> writes:
On Monday, 11 March 2024 at 09:55:04 UTC, Martyn wrote:
 On Sunday, 10 March 2024 at 04:24:15 UTC, Walter Bright wrote:
 On 3/9/2024 3:32 PM, Lance Bachmeier wrote:
 I "ported" a few thousand lines of C to D in a couple hours 
 this afternoon. That includes the time it took to put all C 
 memory allocation inside SafeRefCounted. With the overhead 
 out of the way (setting up the SafeRefCounted structs, 
 testing, and some minor other things) I bet I could easily 
 port 20,000 lines in an 8-hour day. Working directly with C 
 macros was the last thing needed to make this go fast.
Thanks for posting that, I enjoy such testimonials!
+1 Very impressive!
+1 also. Quite a useful use case.
Mar 13
prev sibling parent Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 9 March 2024 at 02:33:16 UTC, Gregor Mückl wrote:
 This isn't going to happen in this century.

 You're talking about an absolutely *gigantic* amount of 
 software - an utterly, unfathomably, big amount. Many thousand 
 lifetimes' worth of work.

 It would be a major miracle if even a single one of these 
 chunks of software would get replaced by a rewrite from scratch 
 within the next one or two decades.
There's a hidden assumption that this task has to be accomplished the way the software was originally written, rather than being largely automated. ImportC is a strong beginning of such an approach, so that C can gradually be D'ified and, en route, made safer. Moving C to D largely automatically is looking like a real prospect now, turning the impossible complete rewrite into something quite different.

If D reaches a place where human-assisted automatic translation of C source to D source is mostly just automatic, then it becomes a strong contender to solve a significant fraction of the problem by an unexpected route. As I see it, ImportC is a hint about the future.
Mar 10
prev sibling parent reply Dukc <ajieskola gmail.com> writes:
On Friday, 8 March 2024 at 09:09:12 UTC, Monkyyy wrote:
 Python, Java, and JS are not exactly safe languages; there's no 
 way to interpret the high ranking as them being coherently 
 designed around safety.
They are safe languages, as far as the common definition goes. What they lack, compared to the likes of D, Rust or Nim, is the ability to forgo the GC and allocate memory / cast types manually when you have to. Well, there usually is some way to do that even in those languages if you're determined enough, but it tends to be much harder than in a purpose-built systems programming language. Plus, in all likelihood the low-level controls are completely implementation-specific, as opposed to a standard part of the language. C and C++ are the opposite: you can go low-level easily enough, but they don't have a standard safe subset of the language.
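For contrast, a minimal sketch of how D standardizes that split in the language itself (function names are mine, just for illustration):

```d
// A standard safe subset with an explicit, labeled escape hatch.
@safe int sumSlice(const int[] a)
{
    int total = 0;
    foreach (x; a)      // bounds-checked by the language
        total += x;
    return total;
}

@system int peekNext(int* p)
{
    return *(p + 1);    // pointer arithmetic: allowed only outside @safe
}
```

@safe code cannot call @system code directly; the boundary is checked by the compiler rather than by convention.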
Mar 08
parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Friday, 8 March 2024 at 14:16:14 UTC, Dukc wrote:
 On Friday, 8 March 2024 at 09:09:12 UTC, Monkyyy wrote:
 Python, Java, and JS are not exactly safe languages; there's 
 no way to interpret the high ranking as them being coherently 
 designed around safety.
They are safe languages, as far as the common definition goes
https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=javascript "5146 CVE Records"

https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=c "1748 CVE Records"

Python and JS have more mechanisms to run giant stacks of insane code from dependency hell, and it's insane to call that safe; I'm pretty sure JS is the main cause of malware spreading, but if it isn't, it's up there.
Mar 08
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 8 March 2024 at 17:20:22 UTC, monkyyy wrote:
 On Friday, 8 March 2024 at 14:16:14 UTC, Dukc wrote:
 On Friday, 8 March 2024 at 09:09:12 UTC, Monkyyy wrote:
 Python, Java, and JS are not exactly safe languages; there's 
 no way to interpret the high ranking as them being coherently 
 designed around safety.
They are safe languages, as far as the common definition goes
 https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=javascript "5146 
 CVE Records" https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=c 
 "1748 CVE Records" Python and JS have more mechanisms to run giant 
 stacks of insane code from dependency hell, and it's insane to call 
 that safe; I'm pretty sure JS is the main cause of malware 
 spreading, but if it isn't, it's up there.
Those exploits fall under the 30% of the attack surface that remains after we get rid of the 70% caused by memory corruption exploits.
Mar 08
parent monkyyy <crazymonkyyy gmail.com> writes:
On Friday, 8 March 2024 at 19:01:06 UTC, Paulo Pinto wrote:
 Those exploits fall under the 30% of the attack surface that 
 remains after we get rid of the 70% caused by memory corruption 
 exploits.
how does 5k vs 1.7k turn into 30%?
Mar 08
prev sibling next sibling parent Monkyyy <crazymonkyyy gmail.com> writes:
On Wednesday, 6 March 2024 at 09:19:20 UTC, RazvanN wrote:
 Here is the full paper: 
 https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/70477b1d77462cfffc909ca7d7d46d8f749d5642.pdf

 RazvanN
*scrolls down to conclusion*
 JVM, Go, Rust, Carbon, a vague notion of safe C++
 two of these are Google's, one is the safety meme; JVM and 
 magic C++ aren't specific products
This is just an ad for Google's languages, piggybacking off Rust's memes.
Mar 07
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
C could become far more memory safe if it adopted slices:

https://www.digitalmars.com/articles/C-biggest-mistake.html
Mar 07
parent Marconi <soldate gmail.com> writes:
On Friday, 8 March 2024 at 06:29:17 UTC, Walter Bright wrote:
 C could become far more memory safe if it adopted slices:

 https://www.digitalmars.com/articles/C-biggest-mistake.html
D's biggest mistake is NOT being a better C++.
Mar 08