If you want a good list of command-line tools to do this sort of thing, have a look at the ImageOptim website (http://imageoptim.com/), which lists all of the command-line tools it combines together.
ImageOptim is my current go-to... it's pretty solid.
I have an image that ImageOptim brought down to 149 KB. I ran THAT already-compressed image through compressor.io and it brought it down to 51 KB. Whoa. Very minimal degradation in some of the image's drop shadows, but nothing I'd care about given a file-size saving like that.
Google's mod_pagespeed for Apache/nginx does a good job of compressing images: it does things like converting to a 4:2:0 YUV colourspace, and it also serves WebP to clients that support it.
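For reference, enabling this in Apache looks roughly like the following. The directive and filter names are standard mod_pagespeed ones, but treat this as a sketch and check the docs for your version:

```apache
# Sketch of a mod_pagespeed image-optimization setup for Apache.
ModPagespeed on
# Recompress images in place (lossy JPEG recompression, lossless PNG recompression).
ModPagespeedEnableFilters rewrite_images,recompress_jpeg,recompress_png
# Serve WebP to browsers that advertise support for it.
ModPagespeedEnableFilters convert_jpeg_to_webp
```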
The advantage of the mod_pagespeed approach is that you can keep hi-res images in your web page and not have to be concerned with manually compressing things.
You can get decent results on JPEG images using jpegtran, and if you don't mind progressive JPEGs, you can get slightly smaller files with a progressive scan table optimized for your images (determining the ideal scan order for your dataset is left as an exercise for the reader):
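A minimal sketch of those jpegtran invocations (assumes libjpeg/libjpeg-turbo's jpegtran is on your PATH; the function names are mine):

```shell
# Lossless JPEG optimization with jpegtran.
optimize_jpeg() {
  src="$1"; dst="$2"
  # -copy none drops metadata; -optimize recomputes the Huffman tables losslessly.
  jpegtran -copy none -optimize -outfile "$dst" "$src"
}

# Progressive variant: usually shaves off a bit more.
# Add "-scans my.scans" here if you've worked out a custom scan table.
progressive_jpeg() {
  src="$1"; dst="$2"
  jpegtran -copy none -optimize -progressive -outfile "$dst" "$src"
}
```

Usage is just `optimize_jpeg in.jpg out.jpg`; both transforms are lossless, so you can run them on already-exported images.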
I'm busy combing through /usr/bin for other trivial utilities that I can turn into a web site for residual income. I was tempted to create a parody site that would execute 'ls' for you, all you'd do is add the site's public ssh key to your system.
Besides offering lossless-quality images, JPEGmini allows a much larger file size. Compressor.io only appears to allow 10 MB files, which isn't very useful if you're trying to use it for photography, because the JPEGs produced by a Canon 5D Mark III are about 20-30 MB. It is also web-only, whereas JPEGmini has a native client. Compressor.io looks promising though.
If I had to choose, I'd say the visual quality of JpegMini is fractionally better, but it's hardly perceptible to my eye - they both produce incredible results.
Since compressor.io is down we're shamelessly recommending https://kraken.io instead.
We invest very heavily in dedicated infrastructure (dual hexa-core CPUs) and provide unparalleled optimization and compression, with both lossy and lossless options. We have a serious API and bulk upload and download options. Built-in CDN (SSD-based) integration is coming within a few weeks.
Are you guys using proprietary image compression tools, or are you doing the same thing as these guys and just sticking a web interface in front of open source tools?
I am always amazed at the level of snark that creeps up whenever someone has the gall to launch a product that is not sufficiently innovative (as defined by most of the commenters here).
It does not matter that this functionality is trivially available to anyone with a CLI. There are tons of people who do not know what a CLI is and would not know what to do with it if they found one. Sticking a nice UI on a utility is a worthy innovation. I can imagine lots of web designers or non-technical people with blogs would find this very useful.
Having more options to compress your images is a good thing. The fact that other services that accomplish similar things exist does not disqualify you from launching a service. How many Chinese restaurants are in your neighborhood?
Let us choose to be on the side of the builder. It is hard enough to create stuff without having to deal with the snark from bystanders in the peanut gallery.
I agree with you. I'm the guy behind compressor.io, and it was never my goal to launch a big competitor to existing platforms. I just tried to create the easiest, simplest interface for people who want a quick and nice way to compress their images.
Compressor isn't a revolutionary image compression tool, it just uses the best open-source libraries to achieve an effective result, comparable to other online tools.
So, say I have 25GB of photos on my OneDrive. How would I go about compressing them all losslessly, similar to the way this website does it? I use Windows, so no ImageOptim for me.
For bigger images the quality always gets dropped to 80 and the Huffman tables are optimized. For smaller ones the quality varies, so he's using a tool to automatically pick the right quality per image.
This was my first thought and I actually looked to see if it was a gag site. Otherwise I agree with other comments that it needs to be a CLI tool to become truly useful for me.
How do you innovate? Did you invent a new algorithm, or is the backend just using special configurations of open-source tools? Compressor.io only did 1% better on an image already optimized with GIMP.
This probably gives you no better compression than using a CLI tool or something like imageoptim. But where this wins is in the presentation. If you're a lay-person, if you see this, you see magic. You get to immediately see your file size shrink and get proof with an interactive visual widget.
Perfect for my needs. I compress images for blog posts, and rather than having to load a compressor, this is perfect. Go to website, drag & drop, save.
+ I just compressed JPG which I previously saved for web with Photoshop. No loss of quality, 18% smaller filesize.
Excellent work, but as others have said you're launching too early without an API, batch upload or other features everyone on Hacker News would consider "core".
Command line is definitely best for scriptability and repeatability.
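As a sketch of that scriptability, here's a batch version of the lossless jpegtran pass over a whole directory (the directory name is illustrative, and jpegtran is assumed to be installed):

```shell
# Losslessly recompress every JPEG in a directory, replacing each file
# only if jpegtran succeeds.
batch_optimize() {
  dir="$1"
  for f in "$dir"/*.jpg; do
    [ -e "$f" ] || continue   # skip if the glob matched nothing
    jpegtran -copy none -optimize -outfile "$f.tmp" "$f" && mv "$f.tmp" "$f"
  done
}
# usage: batch_optimize photos
```

Drop it in a cron job or a build step and it's repeatable in a way a drag-and-drop site never will be.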
The thing I like about ImageOptim is that it runs multiple strategies (for PNG and JPEG at least) and picks the best one. Plus its ease of use for one-off compressions can't be beat.
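The "run multiple strategies, keep the best" idea is easy to replicate in a script. A sketch (optipng and zopflipng are real optimizers, but treat the commented pipeline as illustrative):

```shell
# Print the path of whichever of two files is smaller --
# the core of the "try several optimizers, keep the smallest output" approach.
smaller_file() {
  if [ "$(wc -c < "$1")" -le "$(wc -c < "$2")" ]; then
    echo "$1"
  else
    echo "$2"
  fi
}
# e.g. for a PNG:
#   cp in.png a.png && optipng -quiet -o2 a.png
#   cp in.png b.png && zopflipng -y b.png b.png
#   best=$(smaller_file a.png b.png)
```

Since every candidate is lossless, picking by size alone is safe; for lossy strategies you'd also need a quality check before comparing sizes.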
Photoshop export (yes, even Save for Web) is notoriously inefficient. As others have mentioned here, ImageOptim (and the various command line tools it uses) can make your images significantly smaller (particularly PNGs).
Well, that's a bold claim, but let's see if Photoshop really is inefficient. I took the original iguana image from the Compressor site, and then used Photoshop save for web to bring it down to 250kb (same size as the after image from Compressor).
I picked a spot and zoomed in where it's easiest to see the artifacts.