20 years

misc

TWENTY

DAMN

YEARS

September 11, 2001 is quite a memorable day for the obvious reasons. But for me, it holds an additional significance. Because on September 10, 2001 I registered the domain bit-101.com and on the morning of September 11 the first version of the site went live, only to be massively overshadowed by other events just a couple of hours later.

Initially the site was a single page with a Flash application containing a calendar that linked to various interactive experimental pieces. I’d started doing the experiments in late August, so I was able to launch BIT-101 with fourteen experiments. It ultimately grew to over 600.

This was the previous site that was retired on 9/11/01, also fully Flash:

That KP logo came in with a really cool animation and there was a funky 5-second free music loop that I snagged off of FlashKit, which got really annoying after roughly 10 seconds.

A later version of BIT-101:

Yeah, I liked the Papyrus font back then. Also… what are lower case letters? All those sections were draggable and closable windows. Peak 2002 “web design”.

BIT-101 lasted in this general form, with various interface changes up until the end of 2005. There were many months I posted something new every day. Towards the end, it got a bit slower.

While all this was going on, near the end of 2003, I started the first BIT-101 blog. I say the “first” one because in late 2017 I did a blog reboot, to the new blog that you are reading here. The old one had a good 14 year run though. And is immortalized here: http://www.bit-101.com/old/. Amazing to think that the blog reboot is now almost 4 years old, which is about as long as the first old Flash site lasted. Time keeps moving faster.

Changes

Things sure have changed since that first site 20 years ago. Back then it was all about Flash for me. I was not working full time as a programmer, but I had a steady flow of side jobs doing Flash work. I’d written a few Flash tutorials on the KP Web Design site and those had done really well. In fact, that led to my contributing to my first book, Flash Math Creativity.

This led to many more books, mostly with Friends of ED and Apress, but also O’Reilly.

In 2003 I was invited to FlashForward in NYC, where I was a finalist in the Experimental category of their Flash awards. I remember being so starstruck meeting all my Flash heroes there – many of whom I consider good friends to this day. As it turned out, I won the award, which was amazing. I went back to my day job the following Monday. I was working in the estimation department of a mechanical contracting company. I hated that job. I was thinking, “Why am I here? I am a published author and I just won an award for Flash. That’s what I should be doing.” Amazingly, when the boss came in, he called me into his office. Apparently I had screwed up delivering an estimate the previous week, and he fired me. What I remember most clearly about that conversation was trying not to smile as I realized I was free. The next day I went to a company in Boston I’d been in touch with about doing some Flash work on the side and said I was ready to go full time. They hired me, and thus began my official career as a “professional” developer.

Of course, Flash officially died earlier this year. But I had really moved on from it in early 2011, when I did my “31 days of JavaScript” series on the old blog. The inaugural post is here: http://www.bit-101.com/old/?p=3030. This series got a lot of attention, and by the end of it I had switched over to doing all my personal creative coding using HTML5 Canvas.

In 2018 I started looking for some other platforms for creative code. I discovered Cairo Graphics, a C library that is pretty similar to the Canvas API in JavaScript. It has bindings for many other languages. I tried it with Python and liked it, but wanted to learn a new language. I’d been interested in both Rust and Golang. I converted my JS library over to Rust and got it working well. But Rust is a pretty exacting language. I found it hard to work with for something like creative coding. I spent more time trying to satisfy the compiler than I did writing any interesting code. So I tried Go and that really hit the spot. It’s been the mainstay language for my creative work for the last three and a half years, though I still keep active in JavaScript as well.

Work-wise, starting from my first job in 2003:

  • Exit 33 / Xplana Learning
  • Flash Composer
  • Brightcove
  • Infrared5
  • Disney
  • Dreamsocket
  • Notarize

I started all of those jobs as a Senior Developer/Engineer/Programmer. At Notarize I am now an Engineering Manager, managing 10 other engineers and not really doing any hands-on coding myself. That’s fine with me. It’s a totally new challenge and I’m enjoying it, especially seeing and helping new grads fresh out of school grow into amazing engineers. Interestingly, only two of those jobs required a formal interview. The rest of them were almost straight to offer from people I had gotten to know well through the Flash community.

Summary

It’s been an amazing 20 years. I had no idea where this was going when I randomly came up with “bit-101” and registered the name back then. But it’s worked out pretty damn well. What about the next 20 years? If I’m still breathing and able to type coherent code, I’ll be cranking out something for sure.

More gif-making tips and tools

misc, tutorial

I’ve been continuing my search for the ultimate gif-making workflow and came across two more tools.

gifsicle

and

gifski

Both of these are command line tools available across platforms.

gifsicle

I first heard about gifsicle a while ago as a tool to optimize gifs. I tried it on some of the larger ffmpeg-created gifs and it didn’t seem to do a whole lot. You can specify three levels of optimization. The lowest level didn’t have any effect. The highest level took a single megabyte off of a 27 MB gif. Not really worth it.

You can also use gifsicle to reduce the number of colors in an existing gif. This cut the size down considerably, but at a noticeable loss of quality. I think it would be better to do this in the palette-creation stage.
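
For reference, those two operations look something like this (file names are just placeholders; -O3 is the highest optimization level):

gifsicle -O3 big.gif -o optimized.gif        # optimize an existing gif
gifsicle --colors 64 big.gif -o reduced.gif  # reduce the palette to 64 colors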

But gifsicle can also create gifs from a sequence of images, just like ImageMagick and ffmpeg. There’s a big drawback to this workflow though: the source images have to be gifs themselves. OK, I was able to put together a quick ImageMagick script to convert all my pngs to gifs, but that took quite a while. I didn’t time it, but I feel like it was more than a minute for 300 frames. As for size, the result was almost right in between the sizes produced by ImageMagick and ffmpeg.
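
If you want to try that workflow anyway, it goes roughly like this (your exact script will differ; note that gifsicle’s --delay is in hundredths of a second, so 3 is close to 30 fps):

mogrify -format gif frames/*.png                   # convert each png to a gif alongside it
gifsicle --delay=3 --loop frames/*.gif -o out.gif  # assemble the gifs into one animation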

But the additional conversion process put this one out of the running for me.

gifski

I think this is a very new solution. And it’s written in Rust, which tends to give projects a lot of street cred these days.

After using it, I am impressed.

The syntax is pretty straightforward:

gifski --fps 30 frames/*.png -o out.gif

I don’t think I really need to explain any of that.

Performance-wise, it hits a pretty sweet spot. Not as fast as ffmpeg, but image sizes are way smaller. Not as small as ImageMagick’s output, but way faster.

Here are the results I got:

Time:

ffmpeg:       5.780 s
gifski:      19.341 s
ImageMagick: 43.809 s
gifsicle:    way too long, counting the image conversion needed

Size:

ImageMagick: 13 MB
gifski:      16 MB
gifsicle:    18 MB
ffmpeg:      27 MB

Summary

I feel like it’s pretty straightforward.

If you are going for size, nothing beats ImageMagick, but it takes forever.

If you are going for speed, nothing beats ffmpeg.

If you are dealing with gifs as your source image sequence, gifsicle might be a good compromise.

But I think the overall winner is gifski in terms of hitting that sweet spot. I’ll be using it a lot more in the coming days and weeks and will update with any new findings.

I should also note that all my tests have been on grayscale animations. Full color source images could change everything.

A note on quality and duration

All of the gifs produced seemed to me to be of very comparable quality. I didn’t see any quality issues in any of them. To my eye, they seemed like the same gif, with one exception – duration.

Actually, I discovered today that ImageMagick’s delay property will truncate decimal arguments. Or maybe round them off; I’ve gotten conflicting info. Anyway, I’ve been using a delay of 3.33 to make gifs run at 30 fps, but it turns out that just sets a delay of 3/100ths of a second, so they’ve actually been running a bit faster than 30 fps. Somehow, the gifs created with ffmpeg and gifski do seem to run at the exact fps specified. Specifically, a 300-frame animation set to run at 30 fps should run for 10 seconds, as the ffmpeg and gifski gifs do, but ImageMagick’s finishes in 9 seconds.
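
In other words, 300 frames at the intended 1/30th of a second each should take 10 seconds, but at the truncated 3/100ths of a second each it only takes 9.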

I tried some other formats for the delay parameter. Apparently you can specify units precisely, like -delay 33,1000 for 33/1000ths of a second, or even -delay 333,10000 for 333/10000ths of a second. But this doesn’t make a difference. From what I understand, the gif format itself does the rounding to 100ths of a second. If so, I’m not sure what ffmpeg and gifski are doing to make it work correctly.

Big deal? Maybe, maybe not. Worth noting though.

More ffmpeg tips

tutorial

Palettes

In a recent post, I shared making animated gifs with ffmpeg. I hadn’t been doing it very long so I wasn’t sure how it would work in the long run. Lo and behold, I ran into a problem. I was making black and white (actually monochrome grayscale) gifs and for the most part it was going well. But then I saw some yellow getting in there!

None of the still images had anything but grayscale values. So how was I getting yellow? To be honest, I’m not 100% sure of the details, but it’s got to do with palettes. Gifs generally have just 256 available colors. There are tricks to make animated gifs use more than that, but let’s stick with the basic case: 256. The frames you create for your animations will likely be pngs, which means they can have millions of colors. Somehow, ffmpeg needs to take all those millions of colors and choose just 256 for the gif.

Last time, I posted this command:

ffmpeg -framerate 30 -i frames/frame_%04d.png out.gif

This was working pretty well for my grayscale gifs, but if you tried using it for full-color animations, there’s a good chance you ended up with a mess, because you basically got a random palette. I don’t know how the palette is chosen in that case, but there’s a damn good chance it’s not going to be right. So you need to create a decent palette. That’s done with the palettegen filter. First you run ffmpeg to generate a new 16×16 pixel image (256 pixels) that contains the palette it thinks it should use. That looks like this:

ffmpeg -i frames/frame_%04d.png -vf palettegen palette.png

I’ll keep the same flow of going through each parameter:

-i frames/frame_%04d.png – chooses the frames that contain the palette data.

-vf palettegen – a video filter. The filter is palettegen which generates a palette.

palette.png – the output image holding the palette.

The result, as I said, is just a 16×16 png image. You can open it up and see your palette. This has been done by analyzing all the images in the sequence and determining which 256 colors can be used across the entire animation to make things look decent.

Now you use this palette image in another call to ffmpeg:

ffmpeg -framerate 30 -i frames/frame_%04d.png -i palette.png -filter_complex paletteuse out.gif

Once again, I’ll break it down.

-framerate 30 – the fps just like before.

-i frames/frame_%04d.png – the input frames, just like before.

-i palette.png – yet another input, this time, the palette image.

-filter_complex paletteuse – this is where the magic happens.

out.gif – the output image, just like before.

So the filter_complex one is pretty complex, especially if you try to look up the documentation or examples. You’ll find examples like this (IGNORE THESE!):

ffmpeg -i input.mp4 -filter_complex "[0:v]scale=iw:2*trunc(iw*16/18), \
boxblur=luma_radius=min(h\,w)/20:luma_power=1:chroma_radius=min(cw\,ch)/20: \
chroma_power=1[bg];[bg][0:v]overlay=(W-w)/2:(H-h)/2,setsar=1" output.mp4

or…

ffmpeg -i bg.mp4 -i video1.mp4 -i video2.mp4 -filter_complex \
"[0:v][1:v]setpts=PTS-STARTPTS,overlay=20:40[bg]; \
 [bg][2:v]setpts=PTS-STARTPTS,overlay=(W-w)/2:(H-h)/2[v]; \
 [1:a][2:a]amerge=inputs=2[a]" \
-map "[v]" -map "[a]" -ac 2 output.mp4

You’ll generally find these on Stack Overflow, with instructions like, “Just do this…” and no explanation at all of why you should JUST do that.

If you’re lucky, you’ll at least find something like…

 -filter_complex "fps=24,scale=${SIZE}:-1:flags=lanczos[x];[x][1:v]paletteuse"

I stripped away everything but the filter_complex part there. This one actually came from a good friend, Kenny Bunch, who saw my last article and happened to be digging into the exact same stuff at the same time, but with full-color animations.

This one was still more complex than I needed, but it was the simplest example I found, so I was very thankful.

So back to basics. filter_complex is a way of defining a… well, a complex filter. You define the filter in a string and can chain together multiple actions to do all kinds of fancy things. The way it works is a series of filter actions that each produce an output, and then you can use that output in further actions. Like this, broken down per action:

do first filter action[a]; does the first filter action and stores the output in a variable a.

[a] do another filter action[b]; feeds the output a into the next action, and saves that as b.

[b] yet another action[x]; you get the idea.

So in the above example:

fps=24,scale=${SIZE}:-1:flags=lanczos[x]; Sets the fps to 24, scales the gif to a given size on x and keeps the aspect ratio on y (the -1 param), uses Lanczos resampling to do the scaling. Stores the result in x.

[x][1:v]paletteuse Takes the data in x and uses input 1 (the palette image) as a palette using the paletteuse filter. In short, uses the palette image as the palette for the animation.

In my case, I’d already set the framerate, and I wasn’t scaling anything, so I could get rid of that whole first action. And apparently, ffmpeg was smart enough to figure out that input 0 was the frames and input 1 was the palette, so I could shorten the entire thing down to:

-filter_complex paletteuse

Magical.
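
So, to recap, the whole two-step process ends up being just:

ffmpeg -i frames/frame_%04d.png -vf palettegen palette.png

ffmpeg -framerate 30 -i frames/frame_%04d.png -i palette.png -filter_complex paletteuse out.gif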

Using these two commands, I was able to get a correctly paletted animation, with no yellow:

I still don’t know exactly why I was getting yellow, since all the source frames were completely grayscale. I guess it was just the complexity of calculating the values of all the channels of all the pixels from all the frames. Somewhere along the line, it wound up with a bit less in the blue channel for some reason. And once it started, it just multiplied.

Anyway, couldn’t help thinking of this…

Speed and Size

Other considerations when deciding whether to use ffmpeg or ImageMagick are rendering speed and file size.

ffmpeg will create gifs way faster than ImageMagick. A quick test:

Input: 300 frames, 500×500 pixels each.

ImageMagick: 29.985 seconds

ffmpeg: 4.301 seconds

And the ffmpeg test included generating the palette as well as the animation.
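
For reference, the basic ImageMagick command for the same job is something along these lines (-delay is in hundredths of a second, aiming for 30 fps):

convert -delay 3.33 -loop 0 frames/frame_*.png out.gif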

File size is not so good a story though. Same animation:

ImageMagick: 13 MB

ffmpeg: 27 MB

Other examples weren’t quite that bad, but ImageMagick always wins handily in the size category.

Of course, the other thing I mentioned last time was that ImageMagick will consistently use so much memory that the process just crashes and fails, whereas ffmpeg never has a problem in that area.

Just Say Yes

One last trick, if you’re scripting these commands and your script constantly stops to ask whether you want to overwrite your previous output gif with a new animation: just add the -y parameter to your command and it won’t bother you anymore.
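
For example:

ffmpeg -y -framerate 30 -i frames/frame_%04d.png -i palette.png -filter_complex paletteuse out.gif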