
digitalmars.D.learn - better video rendering in d

reply monkyyy <crazymonkyyy gmail.com> writes:
My current method of making videos, using raylib to generate 
screenshots, throwing those screenshots into a folder, and calling 
a magic ffmpeg command, is ... slow.

Does anyone have a demo or a project that does something smarter 
(or is anyone willing to do the busy work of finding the right 
combination of dependencies that just works)?

I require basic images, text, and transparent rectangles

https://youtu.be/HxFSmDNvDUI

Ideally raylib or ImageMagick for the frame generation.
Mar 21 2023
next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Mar 21, 2023 at 04:57:49PM +0000, monkyyy via Digitalmars-d-learn wrote:
 My current method of making videos of using raylib to generate screenshots,
 throwing those screenshots into a folder and calling a magic ffmpeg command
 is ... slow.
[...]

How slow is it now, and how fast do you want it to be?

One possibility is to generate frames in parallel... though if you're 
recording a video of a sequence of operations, each of which depends 
on the previous, it may not be possible to parallelize.

I have a toy project that generates animations of a 3D model 
parametrized over time. It generates .pov files and runs POVRay to 
generate frames, then calls ffmpeg to make the video. This is 
parallelized with std.parallelism.parallel, and is reasonably fast. 
However, ffmpeg will take a long time no matter what (encoding a 
video is a non-trivial operation).


T

-- 
Try to keep an open mind, but not so open your brain falls out. -- theboz
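That parallel setup can be sketched roughly like this. The `renderFrame` 
body is hypothetical; in the POVRay project it would stand in for writing 
a .pov file for the frame's timestamp and shelling out to povray:

```d
import std.parallelism : parallel;
import std.range : iota;
import std.format : format;

// Hypothetical per-frame renderer: each frame depends only on its
// timestamp, so frames can be rendered independently. Here it just
// returns the output filename to show the shape of the loop.
string renderFrame(int i, int fps)
{
    double t = cast(double) i / fps; // frame timestamp in seconds
    // ... generate the scene for time t, render it to frameXXXX.png ...
    return format("frame%04d.png", i);
}

void main()
{
    enum fps = 30;
    enum totalFrames = 5 * fps;      // 5 seconds of video
    auto names = new string[totalFrames];

    // Distinct indices, so concurrent writes to `names` don't collide.
    foreach (i; parallel(iota(totalFrames)))
        names[i] = renderFrame(i, fps);

    assert(names[0] == "frame0000.png");
    // A single ffmpeg invocation then stitches the frames together:
    //   ffmpeg -framerate 30 -i frame%04d.png out.mp4
}
```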
Mar 21 2023
parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Tuesday, 21 March 2023 at 17:18:15 UTC, H. S. Teoh wrote:
 On Tue, Mar 21, 2023 at 04:57:49PM +0000, monkyyy via 
 Digitalmars-d-learn wrote:
 My current method of making videos of using raylib to generate 
 screenshots, throwing those screenshots into a folder and 
 calling a magic ffmpeg command is ... slow.
 [...]

 How slow is it now, and how fast do you want it to be?

 T
I vaguely remember an hour and a half for 5 minutes of video, even 
though it's extremely lightweight: raylib trivially renders it in 
real time for normal display, and realistically I wouldn't be 
surprised if it could do 1000 frames a second.

Copying several GB of data to disk (probably asking the GPU for one 
pixel at a time) just to compress it down into a dozen MB of video 
is... a temporary hack. I should do something that isn't stressing 
the hard drive so unnecessarily.
Mar 21 2023
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Mar 21, 2023 at 05:29:22PM +0000, monkyyy via Digitalmars-d-learn wrote:
 On Tuesday, 21 March 2023 at 17:18:15 UTC, H. S. Teoh wrote:
 On Tue, Mar 21, 2023 at 04:57:49PM +0000, monkyyy via
 Digitalmars-d-learn wrote:
 My current method of making videos of using raylib to generate
 screenshots, throwing those screenshots into a folder and calling
 a magic ffmpeg command is ... slow.
 [...]

 How slow is it now, and how fast do you want it to be?

 T
 I vaguely remember an hour and a half for 5 minutes of video, even 
 though it's extremely lightweight: raylib trivially renders it in 
 real time for normal display, and realistically I wouldn't be 
 surprised if it could do 1000 frames a second.

 Copying several GB of data to disk (probably asking the GPU for one 
 pixel at a time) just to compress it down into a dozen MB of video 
 is... a temporary hack. I should do something that isn't stressing 
 the hard drive so unnecessarily.
You could try to feed the frames to ffmpeg over stdin instead of 
storing the frames on disk. See this, for example:

https://stackoverflow.com/questions/45899585/pipe-input-in-to-ffmpeg-stdin

Then you can just feed it live data in the background while you 
generate frames in the foreground.


T

-- 
Lottery: tax on the stupid. -- Slashdotter
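A minimal sketch of that piping approach in D, using std.process.pipeProcess. 
It assumes ffmpeg is on PATH and that frames arrive as raw RGB24 buffers; the 
solid-color fill is a stand-in for real pixel data:

```d
import std.process : pipeProcess, wait, Redirect;
import std.format : format;
import std.conv : to;

void main()
{
    enum width = 640, height = 360, fps = 30;

    // Tell ffmpeg to read raw RGB24 frames from stdin ("-i -") and
    // encode them; no intermediate files ever touch the disk.
    auto pipes = pipeProcess([
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "rgb24",
        "-s", format("%dx%d", width, height),
        "-r", fps.to!string,
        "-i", "-",
        "out.mp4",
    ], Redirect.stdin);

    auto frame = new ubyte[width * height * 3]; // one RGB24 frame
    foreach (i; 0 .. fps * 5)                   // 5 seconds of video
    {
        frame[] = cast(ubyte)(i % 256);         // stand-in for real pixels
        pipes.stdin.rawWrite(frame);
    }

    pipes.stdin.close(); // EOF lets ffmpeg finish encoding
    wait(pipes.pid);
}
```

Closing ffmpeg's stdin is the important step: without the EOF, ffmpeg waits 
for more frames and never finalizes the file.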
Mar 21 2023
parent Ferhat =?UTF-8?B?S3VydHVsbXXFnw==?= <aferust gmail.com> writes:
On Tuesday, 21 March 2023 at 17:46:00 UTC, H. S. Teoh wrote:
 On Tue, Mar 21, 2023 at 05:29:22PM +0000, monkyyy via 
 Digitalmars-d-learn wrote:
 On Tuesday, 21 March 2023 at 17:18:15 UTC, H. S. Teoh wrote:
 [...]
 I vaguely remember an hour and a half for 5 minutes of video, even 
 though it's extremely lightweight: raylib trivially renders it in 
 real time for normal display, and realistically I wouldn't be 
 surprised if it could do 1000 frames a second.

 Copying several GB of data to disk (probably asking the GPU for one 
 pixel at a time) just to compress it down into a dozen MB of video 
 is... a temporary hack. I should do something that isn't stressing 
 the hard drive so unnecessarily.
 You could try to feed the frames to ffmpeg over stdin instead of 
 storing the frames on disk. See this, for example:

 https://stackoverflow.com/questions/45899585/pipe-input-in-to-ffmpeg-stdin

 Then you can just feed it live data in the background while you 
 generate frames in the foreground.

 T
This is how I use a pipe process with D and ffmpeg. I am reading 
video frames, but the other direction should work too.

https://github.com/aferust/oclcv/blob/main/examples/rgb2gray-video/source/app.d
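The reading direction can be sketched the same way: spawn ffmpeg as a 
decoder that dumps raw RGB24 frames to stdout, then read fixed-size 
chunks from the pipe. The input filename here is a placeholder:

```d
import std.process : pipeProcess, wait, Redirect;
import std.stdio : writeln;

void main()
{
    enum width = 640, height = 360;
    enum frameSize = width * height * 3; // RGB24: 3 bytes per pixel

    // Ask ffmpeg to decode a video and write raw RGB24 frames to
    // stdout ("-"); "input.mp4" is a placeholder path.
    auto pipes = pipeProcess([
        "ffmpeg", "-i", "input.mp4",
        "-f", "rawvideo", "-pix_fmt", "rgb24", "-",
    ], Redirect.stdout);

    auto frame = new ubyte[frameSize];
    size_t count;
    // rawRead returns a shorter slice at EOF; a full slice means we
    // got a complete frame and should keep reading.
    while (pipes.stdout.rawRead(frame).length == frameSize)
        ++count; // process the frame here

    wait(pipes.pid);
    writeln(count, " frames read");
}
```

This assumes you know the decoded width and height up front; in practice 
you would force them with an ffmpeg scale filter so frameSize is fixed.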
Mar 21 2023
prev sibling next sibling parent reply Guillaume Piolat <first.last spam.org> writes:
On Tuesday, 21 March 2023 at 16:57:49 UTC, monkyyy wrote:
 My current method of making videos of using raylib to generate 
 screenshots, throwing those screenshots into a folder and 
 calling a magic ffmpeg command is ... slow.

 Does anyone have a demo or a project that does something 
 smarter (or willing to do the busy work of finding the right 
 combo of dependencies that just work)

 I require basic images, text, and transparent rectangles

 https://youtu.be/HxFSmDNvDUI

 ideally raylib or image magik for the frame generation
Hi,

The idea to pipe stdout to ffmpeg is sound.

In the following dead repo: https://github.com/p0nce/y4m-tools

you will find a tool that captures a shader, formats it into Y4M, 
and outputs it on stdout. Y4M output is useful because it embeds the 
metadata, unlike .yuv.

See: 
https://github.com/p0nce/y4m-tools/blob/master/shader-capture/example.sh
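The Y4M container is simple enough to emit by hand. A sketch, assuming 
4:2:0 planar YUV (so the chroma planes are a quarter the size of the 
luma plane); the tiny 4x2 frame exists only to keep the example short:

```d
import std.format : format;
import std.stdio : stdout, File;

// Y4M stream header: width, height, framerate as a rational,
// progressive scan, square pixels, and 4:2:0 chroma subsampling.
string y4mHeader(int w, int h, int fpsNum, int fpsDen)
{
    return format("YUV4MPEG2 W%d H%d F%d:%d Ip A1:1 C420\n",
                  w, h, fpsNum, fpsDen);
}

void writeFrame(File sink, const(ubyte)[] y, const(ubyte)[] u,
                const(ubyte)[] v)
{
    // Each frame is a FRAME marker followed by the three raw planes.
    sink.rawWrite("FRAME\n");
    sink.rawWrite(y); // full-resolution luma: w*h bytes
    sink.rawWrite(u); // quarter-resolution chroma: (w/2)*(h/2) bytes
    sink.rawWrite(v);
}

void main()
{
    enum w = 4, h = 2;
    auto y = new ubyte[w * h];
    auto u = new ubyte[(w / 2) * (h / 2)];
    auto v = new ubyte[(w / 2) * (h / 2)];

    stdout.rawWrite(y4mHeader(w, h, 30, 1));
    writeFrame(stdout, y, u, v);
    // Pipe the result straight into ffmpeg:  ./app | ffmpeg -i - out.mp4
}
```

Because the header carries the geometry and framerate, ffmpeg needs no 
extra flags on the receiving side, which is exactly why Y4M beats raw .yuv 
for piping.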
Mar 24 2023
parent Guillaume Piolat <first.last spam.org> writes:
On Friday, 24 March 2023 at 15:41:36 UTC, Guillaume Piolat wrote:
 Hi,

 The idea to pipe stdout to ffmpeg is sound.

 In the following dead repo: https://github.com/p0nce/y4m-tools

 you will find a tool that capture a shader, format it into Y4M 
 and output on stdout.
 Y4M output is useful because it embeds the metadata unlike .yuv

 See: 
 https://github.com/p0nce/y4m-tools/blob/master/shader-capture/example.sh
Fixed the URLs. How to output .y4m:

https://github.com/p0nce/y4m-d/blob/master/source/y4md/package.d
Mar 24 2023
prev sibling parent Ogi <ogion.art gmail.com> writes:
On Tuesday, 21 March 2023 at 16:57:49 UTC, monkyyy wrote:
 My current method of making videos of using raylib to generate 
 screenshots, throwing those screenshots into a folder and 
 calling a magic ffmpeg command is ... slow.
Why not use ffmpeg as a library? Here are the [bindings](https://code.dlang.org/packages/ffmpeg-d).
Mar 25 2023