
digitalmars.D - Inline assembler for AArch64 code generator

reply Walter Bright <newshound2 digitalmars.com> writes:
Pete Williamson was interested in a D project, so I suggested writing an inline 
assembler for the DMD AArch64 code generator project.

This evening at the D Coffee Haus he showed his work so far. He used AI to 
generate the code! He's going to polish it up and submit it as a PR.
Nov 24
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 25 November 2025 at 07:45:01 UTC, Walter Bright wrote:
 Pete Williamson was interested in a D project, so I suggested 
 writing an inline assembler for the DMD AArch64 code generator 
 project.

 This evening at the D Coffee Haus he showed his work so far. He 
 used AI to generate the code! He's going to polish it up and 
 submit it as a PR.
AIs are getting better and better at programming. They still make mistakes and everything they write should be checked and tested thoroughly, but they’re much better than they were even two years ago.
Nov 25
next sibling parent reply Sergey <kornburn yandex.ru> writes:
On Tuesday, 25 November 2025 at 12:49:55 UTC, jmh530 wrote:
 AIs are getting better and better at programming. They still 
 make mistakes and everything they write should be checked and 
 tested thoroughly, but they’re much better than they were even 
 two years ago.
Full rewrite of DMD with
- Opus 4.5/GPT5/Gemini3 Pro/Grok 5 team
- from scratch in pure D (without C)
- without old design of single data structure
- portable and performant

When? :)
Nov 25
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 25 November 2025 at 13:00:59 UTC, Sergey wrote:
 On Tuesday, 25 November 2025 at 12:49:55 UTC, jmh530 wrote:
 AIs are getting better and better at programming. They still 
 make mistakes and everything they write should be checked and 
 tested thoroughly, but they’re much better than they were even 
 two years ago.
 Full rewrite of DMD with
 - Opus 4.5/GPT5/Gemini3 Pro/Grok 5 team
 - from scratch in pure D (without C)
 - without old design of single data structure
 - portable and performant

 When? :)
Haha, one-shot it? Definitely not yet. The limiting factor right now is probably context size. If we assume Moore's-law-style growth in context limits [1], then something like this is probably a few years off (just guessing, without knowing how much context you would actually need to do this). Experiments should probably start with attempting to modernize smaller pieces of the code base.

[1] https://towardsdatascience.com/towards-infinite-llm-context-windows-e099225abaaf/
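As a rough back-of-the-envelope for that "few years off" guess (the starting context size, doubling period, codebase size, and tokens-per-line are all assumed numbers, not figures from the thread):

```python
# Back-of-the-envelope: years until an LLM context window fits a DMD-scale
# codebase, assuming Moore's-law-style doubling of context limits.
# Every number below is an illustrative guess, not a measured figure.

import math

context_tokens = 200_000   # assumed context window today, in tokens
codebase_loc = 500_000     # assumed size of a DMD-scale codebase, in lines
tokens_per_line = 10       # rough tokens-per-line estimate
doubling_years = 1.0       # assumed doubling period for context limits

needed = codebase_loc * tokens_per_line
years = math.log2(needed / context_tokens) * doubling_years

print(f"need ~{needed:,} tokens; ~{years:.1f} years at one doubling per year")
```

With these guesses it lands at roughly 4-5 years; change any assumption and the answer moves accordingly.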
Nov 25
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/25/2025 5:00 AM, Sergey wrote:
 Full rewrite of DMD with
 - Opus 4.5/GPT5/Gemini3 Pro/Grok 5 team
 - from scratch in pure D (without C)
 - without old design of single data structure
 - portable and performant
 
 When? :)
Well-defined components of it, maybe!
Nov 26
prev sibling next sibling parent reply matheus <matheus gmail.com> writes:
On Tuesday, 25 November 2025 at 12:49:55 UTC, jmh530 wrote:
 On Tuesday, 25 November 2025 at 07:45:01 UTC, Walter Bright 
 wrote:
 ...
... AIs are getting better and better at programming. They still make mistakes and everything they write should be checked and tested thoroughly, but they’re much better than they were even two years ago.
Yes, and using data for training without user consent. We will turn from thinkers into reviewers, and maybe not even that in the future.

By the way, I just learned that Coca-Cola, a $47B company, created their new holiday commercial with AI, and people are pointing out some weird things in it.

It will be interesting to watch DConf with AI presenters in the coming years.

PS: I'm retiring from this industry, but I feel for the new generation.

Matheus.
Nov 25
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 25 November 2025 at 16:05:25 UTC, matheus wrote:
 [snip]

 Yes and using data for training without user consent.
Well, if your code has a license that lets anyone do whatever they want with it, then people will do whatever they want with it...
 We will turn from thinkers to reviewers and maybe not even that 
 in the future.
I think things will change, but I don't have a good sense of whether what you say will be true. I think part of what the vibe coding trend has revealed is that there is a huge pent-up demand for the skills that historically were the province of programmers.
 by the way I just learned that Coca Cola, a $47 B company 
 created their new holidays commercial through AI, which people 
 pointing out some weird things.
I think the internet has universally agreed that was AI slop.
 It will be interesting do watching DConf with AI presenters in 
 coming years.
I've done things like ask ChatGPT what D should do to evolve and improve. Answers weren't bad.
Nov 25
parent matheus <matheus gmail.com> writes:
On Tuesday, 25 November 2025 at 18:07:30 UTC, jmh530 wrote:
 On Tuesday, 25 November 2025 at 16:05:25 UTC, matheus wrote:
 [snip]

 Yes and using data for training without user consent.
 Well if your code has a license that anyone can do whatever they want with your code, then people will do whatever they want with it...
 ...
Well, I wasn't talking about that kind of code or data. I'm talking about the use, without consent or knowledge, of everything stored on the cloud. There are many articles, and even some cases being litigated, about this. Anyway, we will have to wait and see how it turns out.

Matheus.
Nov 25
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/25/2025 8:05 AM, matheus wrote:
 Yes and using data for training without user consent.
Pete told me he used the DMD instruction generator instr.d as training data for the AI!
Nov 26
parent jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 26 November 2025 at 08:55:36 UTC, Walter Bright 
wrote:
 On 11/25/2025 8:05 AM, matheus wrote:
 Yes and using data for training without user consent.
Pete told me he used the DMD instruction generator instr.d as training data for the AI!
At least with ChatGPT, you can make your own custom GPT (a RAG) by feeding in files and instructions. I don't think this is the same as training or fine-tuning, because the weights don't change, but it becomes like a knowledge base that the GPT can check first.

I built a custom D GPT by feeding in the spec. You could probably take the same approach and feed in the whole of the DMD code base. There are limits, like size and number of files, so you would have to combine multiple files together. I've been too busy with other stuff to try.
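The "combine multiple files together" step is easy to script. A minimal sketch (the 512 KB chunk size and the `*.d`/`.txt` naming are illustrative assumptions, not ChatGPT's documented limits):

```python
# Concatenate a source tree's .d files into a few large text files, since a
# custom GPT's knowledge base caps both the number of files and their size.
# The 512 KB chunk limit here is an illustrative guess, not a documented cap.

from pathlib import Path

def combine_sources(root: str, out_prefix: str, max_bytes: int = 512 * 1024) -> int:
    """Write combined files named {out_prefix}{n}.txt; return how many were written."""
    chunk: list[str] = []
    size = 0
    count = 0

    def flush() -> None:
        nonlocal chunk, size, count
        if chunk:
            Path(f"{out_prefix}{count}.txt").write_text("".join(chunk))
            chunk, size, count = [], 0, count + 1

    for path in sorted(Path(root).rglob("*.d")):
        # Prefix each file with a banner so the GPT can attribute snippets.
        text = f"// ===== {path} =====\n{path.read_text(errors='replace')}\n"
        if size + len(text) > max_bytes:
            flush()
        chunk.append(text)
        size += len(text)
    flush()
    return count
```

Pointing it at a checkout and uploading the resulting `.txt` files would stay under a per-file size cap while keeping the file count small.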
Nov 26
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/25/2025 4:49 AM, jmh530 wrote:
 AIs are getting better and better at programming. They still make mistakes and 
 everything they write should be checked and tested thoroughly, but they’re much
 better than they were even two years ago.
The generated code is straightforward and looks good, but it could be shrunk considerably by coalescing similar code and using tables. But it's a good start.
Nov 26