Compressor.io – Compress and optimize your images (compressor.io)
129 points by davidbarker on May 22, 2014 | hide | past | favorite | 71 comments


This isn't magic, it's just optipng. I did this locally and got the same file (literally, with the same md5sum):

    optipng -o 3 foobar.png
Also, MediaCrush does the same thing transparently and losslessly for a number of file formats when you upload, and it's open source: https://github.com/MediaCrush/MediaCrush


Has anybody tried Pied Piper?


Their Weissman scores are incredible!


Why bother with Pied Piper? Nucleus will launch soon and I'd rather trust my data to an established brand like Hooli than an unknown startup.


I'm working on something similar; it gave me an insight I thought I'd forgotten about a while ago.


Yeah, but their 3D video compression sucks


Another open source option is PageSpeed, which does it on the fly as an Apache/Nginx plugin.

https://developers.google.com/speed/pagespeed/module

(Disclaimer: I work on this.)


Maybe for that image, but I got totally different results with a sample PNG (with Compressor being much smaller than optipng).


Did you set it to lossless? I didn't investigate the lossy conversion because I can't stand lossy compression.

Edit: Just tested with another image, same result.


Looks fine, but unless this is a command line tool I can integrate into my build scripts then I'm just not going to use it.


If you want a good list of command-line tools to do this sort of thing, have a look at the ImageOptim website (http://imageoptim.com/), which lists all of the command-line tools it combines together.


ImageOptim is my current go-to... it's pretty solid.

I have an image that ImageOptim brought down to 149 KB. I ran THAT already-compressed image through compressor.io and it brought it down to 51 KB. Whoa. Very, very minimal degradation in some of the image's drop shadows, but not enough to care about given that file-size difference.


If you use ImageAlpha before ImageOptim and reduce the colors, you will get even smaller files. Lots of PNGs don't contain many colors anyway.


Google mod_pagespeed for Apache/nginx does a good job of compressing images, it does things like use a 4:2:0 YUV colourspace and it also serves webp format to clients that support such things.

The advantage of the mod_pagespeed approach is that you can keep hi-res images in your web page and not have to be concerned with manually compressing things.


Yeah unfortunately I agree.

Also, how does it compare to other tools such as ImageAlpha (http://pngmini.com)?


You can get decent results on jpeg images using jpegtran, and if you don't mind progressive jpegs, you can get slightly smaller images with a progressive scan table optimized for your images (determining the ideal scan order for your dataset is left as an exercise to the reader):

    jpegtran -optimize -progressive -scans scan_table.txt image.jpg > image_optimized.jpg


I use pngcrush for that, but there are many to choose from.


For pngs you will get better results by using optipng and then pngout on the result.

    optipng in.png -o 5 -strip all -out temp.png
    pngout -f6 -kp -ks temp.png out.png
For jpegs, use jpegrescan.

    jpegrescan -s -t in.jpg out.jpg


I'm busy combing through /usr/bin for other trivial utilities that I can turn into a web site for residual income. I was tempted to create a parody site that would execute 'ls' for you, all you'd do is add the site's public ssh key to your system.


How does this compare with JPEGmini[0]? It's actually one of the best compressors I know (and use).

EDIT:

Compressor.io (lossless): Before 4.72 MB | After 4.72 MB

Compressor.io (lossy): Before 4.72 MB | After 1.27 MB

JpegMini (lossless): Before: 4.72 MB | After 1.3 MB

Test image: http://imgur.com/gbOCJxX

[0] http://www.jpegmini.com


JpegMini claims to be perceptually lossless, not mathematically lossless.


Besides offering lossless-quality images, JPEGmini allows a much larger file size. Compressor.io appears to allow only 10 MB files, which won't be very useful for photography: the JPEGs produced by a Canon 5D Mark III are about 20-30 MB. It is also web-only, whereas JPEGmini has a native client. Compressor.io looks promising, though.


Looks like it depends on the image.

Image1 reduction: Compressor.io 28.67% | JpegMini 34.35%

Image2 reduction: Compressor.io 69.55% | JpegMini 55.41%

If I had to choose, I'd say the visual quality of JpegMini is fractionally better, but it's hardly perceptible to my eye - they both produce incredible results.


Since compressor.io is down, we're shamelessly recommending https://kraken.io instead.

We invest very heavily in dedicated infrastructure (dual-CPU hexacores) and provide unparalleled optimization and compression for both lossy and lossless options. We have a serious API and bulk upload and download options. Built-in CDN (SSD-based) integration is coming within a few weeks.


Are you guys using proprietary image compression tools, or are you doing the same thing as these guys and just sticking a web interface in front of open source tools?


I'm a customer, and I don't really care what drives it. Kraken works very well, I use it all the time.


It's great that it suits you, but I want to know whether it provides compression levels that I'm not already getting.


I am always amazed at the level of snark that creeps up whenever someone has the gall to launch a product that is not sufficiently innovative (as defined by most of the commenters here).

It does not matter that this functionality is trivially available to anyone with a CLI. There are tons of people who do not know what a CLI is and would not know what to do with it if they found one. Sticking a nice UI on a utility is a worthy innovation. I can imagine lots of web designers or non-technical people with blogs would find this very useful.

Having more options to compress your images is a good thing. The fact that other services that accomplish similar things exist does not disqualify you from launching a service. How many Chinese restaurants are in your neighborhood?

Let us choose to be on the side of the builder. It is hard enough to create stuff without having to deal with the snark from bystanders in the peanut gallery.


I agree with you. I'm the guy behind compressor.io, and it was never my goal to launch a big competitor to existing platforms. I just tried to create the easiest, simplest interface for people who want a quick and pleasant way to compress their images. Compressor isn't a revolutionary image compression tool; it just uses the best open-source libraries to achieve an effective result, comparable to other online tools.


So, say I have 25GB of photos on my OneDrive. How would I go about compressing them all in a lossless way similar to the way this website does it? I use Windows so no imageoptim for me.


You use optipng, which this site just wraps.
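For a whole directory tree (like the 25 GB OneDrive case above), one find invocation covers it. A minimal sketch, assuming optipng is installed and on PATH; the `compress_pngs` helper name is just for illustration:

```shell
# compress_pngs DIR: losslessly recompress every PNG under DIR in place.
# Assumes optipng is installed and on PATH; -o5 is a moderate effort level
# (higher levels get much slower for tiny gains), -quiet suppresses chatter.
compress_pngs() {
    find "$1" -type f -name '*.png' -exec optipng -o5 -quiet {} \;
}
```

On Windows you can run the same thing under Git Bash or WSL, since optipng has a native Windows build.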


What about JPEGs?


For bigger images the quality always gets downgraded to 80 and Huffman tables are optimized. For smaller ones the quality differs so he's using a tool to automatically pick the right quality.

@wouwi: what's the tool used for JPEGs?


jpegoptim and jpegtran
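The quality-80-plus-Huffman-optimization pipeline described above can be sketched as a small shell function. `compress_jpeg` is a hypothetical name, and this assumes jpegoptim and jpegtran are installed; the flags shown are standard for both tools:

```shell
# compress_jpeg FILE: cap quality at 80 and optimize the Huffman tables,
# roughly the pipeline described above. Modifies FILE in place.
compress_jpeg() {
    jpegoptim --max=80 --strip-all "$1" &&
    jpegtran -optimize -copy none -outfile "$1.tmp" "$1" &&
    mv "$1.tmp" "$1"
}
```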


Sounds like another Pied Piper clone to me.


This was my first thought and I actually looked to see if it was a gag site. Otherwise I agree with other comments that it needs to be a CLI tool to become truly useful for me.


How do you innovate? Did you invent a new algorithm, or is the backend just using special configurations of open-source tools? Compressor.io did only 1% better on an image already optimized with GIMP.


I've been working on something similar with GitHub integration at https://shrinkray.io

Instead of compressing a single file at a time, or using a CLI / api, you can get one-click optimization for all of the images in your repo.

I'm hoping to add automatic image optimization when new images are committed at some point, but it's a side project, so a little slow going.


I see this on the site:

> Jabatus EX503 - Voir http://www.jabatus.fr

Is it hacked or what?


This probably gives you no better compression than using a CLI tool or something like ImageOptim. But where this wins is presentation. If you're a layperson and you see this, you see magic: you immediately watch your file size shrink, and you get proof via an interactive visual widget.


Perfect for my needs. I compress images for blog posts, and rather than having to load a compressor app, I can just go to the website, drag and drop, and save.

+ I just compressed a JPG which I'd previously saved for web with Photoshop. No loss of quality, 18% smaller file size.


For lossy PNG it is using pngquant, so if you want a command-line version: http://pngquant.org or local GUI: http://pngmini.com
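As a sketch of what a lossy pngquant pass looks like from the command line (the `compress_png_lossy` wrapper name is mine, and the quality bounds are an assumption, not the site's actual settings):

```shell
# compress_png_lossy FILE: lossy palette reduction with pngquant, in place.
# --quality=65-80 aims for that range and skips the file if it can't stay
# above the floor; --ext .png --force overwrites the input file.
compress_png_lossy() {
    pngquant --quality=65-80 --ext .png --force "$1"
}
```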


What's the best way of doing this in Rails?

So far all I got is https://github.com/emrekutlu/paperclip-compression (jpegtran + optipng).


Does not work with chrome for me ( http://imgur.com/hblv5lr ) and when I click download, I get a not found page.


Excellent work, but as others have said you're launching too early without an API, batch upload or other features everyone on Hacker News would consider "core".


Has anyone tried sharp for node? Looks quite promising.

https://github.com/lovell/sharp


Wow, those benchmarks are pretty impressive.


I need a bulk mode, a CLI tool, or an AppleScript.

Edit: Pretty please :)


ImageOptim is what you want: http://imageoptim.com/


Yeah, ImageOptim is my current go-to on Mac. So good. Command line is better, though.


Command line is definitely best for scriptability and repeatability.

The thing I like about ImageOptim is that it runs multiple strategies (for PNG and JPEG at least) and picks the best one. Plus its ease of use for one-off compressions can't be beat.


Thank you!!



It would be nice to have some information on the site about how this actually works and what advantages it has over exporting from Photoshop.


Photoshop export (yes, even Save for Web) is notoriously inefficient. As others have mentioned here, ImageOptim (and the various command line tools it uses) can make your images significantly smaller (particularly PNGs).


Well, that's a bold claim, but let's see if Photoshop really is inefficient. I took the original iguana image from the Compressor site, and then used Photoshop save for web to bring it down to 250kb (same size as the after image from Compressor).

I picked a spot and zoomed in where it's easiest to see the artifacts.

http://i.imgur.com/Sf72oZq.png

If anything, I'd say the Photoshop one is better than Compressor.


Why can't Adobe, with all their resources, simply bring "Save for Web" up to par with JPEGmini and Compressor.io?


I'm assuming licensing issues.


Is the site down/been hacked?


I think it was hacked... This is the text I'm getting from the page:

Jabatus EX503 - Voir www.jabatus.fr


The lossless compression does not compress better than photoshop.

And the lossy one is bad.


Really need an API for this. I'd build a quick WordPress plugin, and would like to build this into custom app WYSIWYGs.

Compression seems pretty good, with very minimal degradation.


There's smush.it for wordpress:

https://wordpress.org/plugins/wp-smushit/


Tried a logo; result is smaller but looks like crap.


This should be a Dropbox feature!


API ?


It is just optipng


nice work!


How does this compare to Hooli or Pied Piper ?


Haha, good one.


Bro, it's "Nucleus", not Hooli.

Its Weissman score sucks, nothing like PP.



