So I was trying to implement ping-pong buffers in my #webgpu library (emphasis on trying). The code throws no errors, but the values are not changing, and I think it comes down to how my library is built: as far as I can tell (I couldn't fully figure it out), the entries are being swapped twice or not being swapped at all. I'm tired, but I have another idea to test.
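A minimal sketch of the failure mode I suspect, with hypothetical names rather than my library's actual internals: ping-pong means each frame reads from one buffer and writes to the other, and the roles swap exactly once.

```js
let buffers = ['bufferA', 'bufferB']; // stand-ins for two GPUBuffers
let readIndex = 0;

function encodeFrame() {
    const read = buffers[readIndex];
    const write = buffers[1 - readIndex];
    console.log(`read ${read}, write ${write}`);

    // The swap must happen exactly once per frame. If two code paths both
    // swap (say, once per bind group entry and once per pass), the indices
    // end up back where they started: the pass keeps reading and writing
    // the same buffers, and the values never appear to change.
    readIndex = 1 - readIndex;
}

encodeFrame(); // read bufferA, write bufferB
encodeFrame(); // read bufferB, write bufferA
```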
Last year, I was working with #AI agents before this new "boom". I was really amazed by them and by the opportunity that only I seemed to recognize on my team.
All that faded away as soon as you realize these agents behave like little kids that give a non-deterministic answer each time; and answering my PM's question "why doesn't this work all the time?" with "IDK" was not enough.
I think the hard part for me as a developer is explaining to non-technical staff in a company why something that seems to spit out coherent results is not good enough. Even now it's hard to put into words, but all I can say is that the ones who know (something, any subject) can't explain it, because we don't have the intermediate language for it.
I heard once that the best people to explain something are the ones who recently learned it. I think the longer the time that passes, the more those mental connections to regular language vanish, until a concept can't be explained, only "thought", and you end up explaining it in jargon.
My point here is that only experienced developers know letting AI touch everything is a bad idea, but we are so knee-deep in the waters of code that we can't explain why. I do have an analogy, though.
Years ago I was working on a Python project that was the backend of an Angular app. I finished my part and I was done, but the higher-ups decided to move me to the Angular side, which I knew nothing about. My coworker knew a great deal of it, so I used him as my search engine; I asked him everything, and why not, when he was right there? This was nice and all until one day he was gone, having resigned a month earlier.
So now I was on a project whose stack I barely knew, and my information resource was gone... I learned my lesson and decided never again to rely on a coworker as my only source of knowledge about a project...
I never felt that feeling again until I started working with AI agents... you feel a dependency, you feel like you can't do it without them, you feel like you are not enough. But you are.
So when tragedy strikes, and it will, whether it's a crash in the prod app or a coworker resigning... you have to be ready.
For the new version of POINTS I'm adding a way to load an HTMLElement as a texture. I think it's a simple way to load a weird font asset without an atlas/spritesheet, or basically anything in the DOM.
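A sketch of one common way to get a DOM element onto the GPU, not necessarily how POINTS implements it: serialize the element into an SVG foreignObject, rasterize it through an img and a canvas, then copy it into a GPUTexture (`device` is an assumed GPUDevice).

```js
async function htmlElementToTexture(device, element, width, height) {
    // wrap the element's markup in an SVG foreignObject so the
    // browser can rasterize it as an image
    const svg = `<svg xmlns="http://www.w3.org/2000/svg" width="${width}" height="${height}">
      <foreignObject width="100%" height="100%">
        <div xmlns="http://www.w3.org/1999/xhtml">${element.outerHTML}</div>
      </foreignObject>
    </svg>`;
    const url = URL.createObjectURL(new Blob([svg], { type: 'image/svg+xml' }));
    const img = new Image();
    img.src = url;
    await img.decode();
    URL.revokeObjectURL(url);

    // draw to a canvas so we have a copyable source
    const canvas = document.createElement('canvas');
    canvas.width = width;
    canvas.height = height;
    canvas.getContext('2d').drawImage(img, 0, 0);

    const texture = device.createTexture({
        size: [width, height],
        format: 'rgba8unorm',
        // copyExternalImageToTexture requires COPY_DST and RENDER_ATTACHMENT
        usage: GPUTextureUsage.TEXTURE_BINDING |
               GPUTextureUsage.COPY_DST |
               GPUTextureUsage.RENDER_ATTACHMENT,
    });
    device.queue.copyExternalImageToTexture(
        { source: canvas }, { texture }, [width, height],
    );
    return texture;
}
```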
Screen capture of a terminal with the Fresh IDE loaded in a folder with a JavaScript project. It looks like VS Code: directory tree on the left, tabs with open files on the right.
Another experiment with my POINTS library. This is basically an old physarum/slime demo I made a while ago, but rebuilt with compute shaders and particles, and a lot more efficient than before.
The idea of physarum is that the particles leave a trail and are able to follow trails from other particles if those trails are stronger, so they eventually follow each other.
The cool thing about this concept is that you can make the particles follow things other than the trails, like, in this case, a video.
The particles have two options: spawn from the center or spawn randomly, and this changes a bit how the video is interpreted.
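A sketch of the physarum steering described above, in illustrative WGSL rather than the demo's actual code: each particle senses the trail texture a short distance ahead on the left, center, and right, and turns toward the strongest trail; sampling a video's brightness here instead of the trail is what makes the particles follow the video. All constants are assumed values.

```js
const physarumWGSL = /* wgsl */`
@group(0) @binding(0) var trailTexture: texture_2d<f32>;
@group(0) @binding(1) var trailSampler: sampler;

const SENSOR_DIST: f32 = 0.01;  // sensor distance, in UV units
const SENSOR_ANGLE: f32 = 0.5;  // sensor spread, in radians
const TURN_SPEED: f32 = 0.2;    // turn rate, in radians per step

// read the trail strength a short distance ahead in a given direction
fn sense(pos: vec2f, angle: f32) -> f32 {
    let samplePos = pos + vec2f(cos(angle), sin(angle)) * SENSOR_DIST;
    return textureSampleLevel(trailTexture, trailSampler, samplePos, 0.0).r;
}

// turn the particle toward whichever direction has the strongest trail
fn steer(pos: vec2f, heading: f32) -> f32 {
    let left   = sense(pos, heading + SENSOR_ANGLE);
    let center = sense(pos, heading);
    let right  = sense(pos, heading - SENSOR_ANGLE);
    if (center >= left && center >= right) { return heading; }
    if (left > right) { return heading + TURN_SPEED; }
    return heading - TURN_SPEED;
}
`;
```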
With the new version of my WebGPU library, POINTS v0.6.0 (https://github.com/Absulit/points), I added a bunch of new features, like support for depth maps, cameras, bloom, and others, so I wanted to make a demo that uses all of these.
This demo has 262K+ instanced particles (cubes) animated in a compute shader, with movement driven by a noise texture. It uses the depth map to create a slight depth of field, and some of the particles are emissive. To make the effect complete, it has some bloom that affects only the bright particles, thanks to the HDR output that allows those cubes to have a brightness beyond white; those values are then tonemapped to fit in the final output.
I think I understand compute shaders better than before. One thing that still eludes me: even though I know I have 1 million particles, they are not all visible, even when I thought that was the case. I'm not entirely sure what it is (not an expert, but I manage); it could be some sync issue related to random numbers, like the generator producing the same value so the particles overlap.
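To illustrate that hypothesis: if every thread derives its random number from the same seed (say, just the frame time), every particle gets the same value and they stack in the same spot. Hashing the particle index into the seed is a common fix. A sketch using the well-known PCG hash, not my actual code:

```js
const randomWGSL = /* wgsl */`
// pcg_hash: a widely used integer hash for per-thread randomness
fn pcg_hash(n: u32) -> u32 {
    var state = n * 747796405u + 2891336453u;
    let word = ((state >> ((state >> 28u) + 4u)) ^ state) * 277803737u;
    return (word >> 22u) ^ word;
}

// returns a float in [0, 1) that differs per particle, even on the same frame
fn rand01(particleIndex: u32, frameSeed: u32) -> f32 {
    // mixing the index in is what keeps particles from all
    // receiving the identical value and overlapping
    let h = pcg_hash(particleIndex ^ (frameSeed * 0x9E3779B9u));
    return f32(h) / 4294967296.0;
}
`;
```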
Discovered a silent bug in my WebGPU library while creating/updating a demo:
I usually create textures that copy and preserve the render pass output for future render passes to use, but for some reason I never created one to read from pass 0.
Internally, a check verifies this index, and if it doesn't exist, adds the current pass data; so technically, if the pass index is 0, it will add the next pass's data into it, whether you want it or not.
I love falsy values; I love them because I love the idea of 0 being equal to null (as in other programming languages like PHP and C), but every once in a while, a little surprise like this one shows up.
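A minimal sketch of the falsy-zero trap; the names are hypothetical, not the actual POINTS internals:

```js
const passTextures = { 0: 'textureForPass0', 1: 'textureForPass1' };

function getPassTexture(passIndex) {
    // BUG: when passIndex is 0, `!passIndex` is true, so pass 0
    // falls into the "index doesn't exist" branch by accident
    if (!passIndex) {
        return 'fallback: next pass data added here';
    }
    return passTextures[passIndex];
}

function getPassTextureFixed(passIndex) {
    // FIX: test for the actual missing value, not for falsiness;
    // `passIndex ?? fallback` also works, since ?? only triggers
    // on null/undefined, never on 0
    if (passIndex === undefined || passIndex === null) {
        return 'fallback: next pass data added here';
    }
    return passTextures[passIndex];
}

console.log(getPassTexture(0));      // wrong: falls into the fallback
console.log(getPassTextureFixed(0)); // right: 'textureForPass0'
```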
That being said, I wouldn't use TypeScript for this. I still don't like the idea of "compiling" in an environment that is interpreted (the web).
So, I created a one-time-click URL on a website (a custom website) and I wanted to test it with a friend via #Discord, and he told me the URL didn't work.
I then shared the URL fragmented so it wasn't clickable; he copied and pasted it, and it worked.
My #question is: is Discord scanning/clicking all the URLs?
I was not aware of the fact that if there's video (which I do have sometimes), it then needs to be updated.
I created a bind group just for video, but then it cascaded into the fact that I need a new pipeline just for this, and if I do that, I need another set of shaders, or a way to separate/remove/move the bindings from the other pipelines (I think). It's quite late and I don't want to do this.
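For context, this is the WebGPU behavior that forces the update: a GPUExternalTexture imported from a video is only valid for the frame it was imported in, so the bind group that references it has to be rebuilt every frame. A sketch, assuming `device`, `video`, and `videoBindGroupLayout` already exist:

```js
function frame() {
    // must be re-imported every frame; the previous external texture
    // expires once the frame's work is submitted
    const externalTexture = device.importExternalTexture({ source: video });

    // the bind group referencing it must be recreated as well
    const videoBindGroup = device.createBindGroup({
        layout: videoBindGroupLayout, // has an externalTexture entry
        entries: [{ binding: 0, resource: externalTexture }],
    });

    // ... encode the passes that use videoBindGroup ...
    requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```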
@tojiro thanks Brandon. Taking a quick look, I see I might not understand everything, because it seems to be lower-level (judging by the jargon) than what I am used to, but I will try it and get back to you. Maybe I will need a bit of help trying to understand it, but I will let you know.
@tojiro OK, I've finally taken a look at this. I had to use some Copilot to understand a few things and match them to my particular case with video.
So let me know if this is correct: with your proposal you can just kind of "plug in" a separate bindGroupLayout and use a specific binding, is that right? So in my case I would have a new bind group and bind group layout with the video reference, and then in the bundle I say: use group X.
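Here is how I picture that idea, as a sketch; the names are illustrative and this is my reading of the proposal, not its exact API. The pipeline layout reserves a dedicated group slot for the video, so the existing bind groups and shaders stay untouched:

```js
const videoBindGroupLayout = device.createBindGroupLayout({
    entries: [{
        binding: 0,
        visibility: GPUShaderStage.FRAGMENT,
        externalTexture: {}, // the video binding
    }],
});

const pipelineLayout = device.createPipelineLayout({
    bindGroupLayouts: [
        mainBindGroupLayout,  // group 0: everything the pipeline already had
        videoBindGroupLayout, // group 1: just the video, rebuilt per frame
    ],
});

// later, when recording the bundle: "use group X"
bundleEncoder.setBindGroup(0, mainBindGroup);
bundleEncoder.setBindGroup(1, videoBindGroup);
```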
Instancing demo with several textured Stanford Lucy models. It uses a depth map to create a somewhat cheap depth of field, blurring the output in the distance.
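A sketch of the cheap-DOF idea in illustrative WGSL (the binding names and focus range are assumptions, not the demo's actual code): blend a blurred copy of the frame with the sharp one, using the depth map as the blend factor.

```js
const dofWGSL = /* wgsl */`
@group(0) @binding(0) var sceneSampler: sampler;
@group(0) @binding(1) var sceneTexture: texture_2d<f32>;
@group(0) @binding(2) var blurredTexture: texture_2d<f32>;
@group(0) @binding(3) var depthTexture: texture_2d<f32>;

@fragment
fn main(@location(0) uv: vec2f) -> @location(0) vec4f {
    let sharp = textureSample(sceneTexture, sceneSampler, uv);
    let blurred = textureSample(blurredTexture, sceneSampler, uv);
    // depth in [0, 1]; far pixels approach 1 and get the blurred copy
    let depth = textureSample(depthTexture, sceneSampler, uv).r;
    let far = smoothstep(0.3, 0.9, depth); // assumed focus range
    return mix(sharp, blurred, far);
}
`;
```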
This is a recreation of an old particles demo I made for AFrame / 8th Wall, but in WebGPU. In this case, also using instancing, I'm creating a burst of particles (cubes) that are removed after a while. The shader is shared across all the cubes, and that's what makes the gradient so uniform across all of them. With the click of a button, the shader can be modified to switch from flat to Lambert shading.
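For reference, the flat-vs-Lambert switch boils down to one term; a sketch in illustrative WGSL, not the demo's actual shader:

```js
const shadingWGSL = /* wgsl */`
fn shade(baseColor: vec3f, normal: vec3f, lightDir: vec3f, useLambert: f32) -> vec3f {
    // N·L clamped to zero: the classic Lambert diffuse term
    let lambert = max(dot(normalize(normal), normalize(lightDir)), 0.0);
    // useLambert is 0.0 for flat shading, 1.0 for Lambert
    return mix(baseColor, baseColor * lambert, useLambert);
}
`;
```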
A small gap where three large slabs of limestone come together to form a patio has become a pocket garden. The arrangement contains red mini snapdragons, calamint, and moss roses in pink, white, and yellow. A red paperclip-style lawn chair towers above the flowers.
A huge flock of sandhill cranes flew over while we were digging potatoes this afternoon. I am a bit sad about the birds leaving and winter coming, but I am also very full of potatoes now so that makes it a lot easier to cope.
Pretty clouds in the sky, with a bit of blue sky peeking through, and a sort of "W" shape of sandhill cranes flying below the clouds.