Feedback for "Upcoming Changes to PNG Image Support"

png
images

#73

Howdy, thank you for being responsive to this issue and addressing our concerns. While I appreciate the concessions the development team has made to try to satisfy the artistic community as a whole, I have a few counterpoints I’d like to share as an illustrator who uses twitter as one of my main social media and art-sharing platforms.

I’ve been following this thread since its inception, as I wanted to see whether the backlash would warrant a mediated response that would make all artists happy, not just sketch and pixel artists. This new amendment preserving PNGs only at low resolution absolutely does not resolve the issue for digital illustrators whose work carries high levels of detail, such as some of mine.

I’ve linked a thread containing a visual example of the size difference in images with non-uniform height-width ratios, such as landscapes. While twitter has always been unfriendly to previewing these images, often focusing the preview on portions of the image that aren’t relevant, the proposed 900px constraint would cripple the level of detail available to artists who only know to upload PNGs for the best image quality: https://twitter.com/Oricalcon/status/1081416456095584256

I’m part of a lucky subset of the population that is both an illustrator and a software/web developer, so I’m a bit more educated on how to preserve image quality across file formats and resolutions. However, the vast majority of artists simply know that PNG is the best format for maintaining image quality when uploading to the web.
These individuals would still attempt to upload their images at dimensions larger than 900px on the longest side, causing the “test” to compress them to JPG and litter them with visual artifacts. Using @RWStandard’s post as an example, the visual artifacts that occur would be prevalent throughout these illustrators’ images. The compression is much, much more visible on digital illustrations than in photography.

TL;DR: I would be much more comfortable with these changes if twitter provided an image preview, a tool, or a link to a converter (such as the one @RME utilized for comparison) that would allow artists to see what their uploaded image will look like, or to test images themselves, so they can maximize quality while minimizing data costs on Twitter’s end. This would allow artists to put their best product forward within Twitter’s new constraints, instead of throwing compressed images littered with visual garbage out to their followers without understanding why their images were so thoroughly trashed.
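Putting my developer hat on for a moment, such a preview does not even need to live on Twitter’s side. A minimal local sketch, assuming Pillow and assuming an 85%-quality, 4:2:0-subsampled progressive JPEG (my guess at the output format, not a confirmed spec), could look like this:

```python
# Re-encode a PNG the way Twitter is presumed to, so the artist can inspect
# the artifacts locally before posting. The settings below are assumptions.
from PIL import Image

def preview_twitter_jpeg(src_path: str, out_path: str = "preview.jpg") -> None:
    img = Image.open(src_path).convert("RGB")  # JPEG has no alpha channel
    img.save(
        out_path,
        format="JPEG",
        quality=85,        # assumed JFIF quality
        subsampling=2,     # Pillow's code for 4:2:0 chroma subsampling
        progressive=True,
    )

preview_twitter_jpeg("illustration.png")
```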

If none of these proposed concessions are possible, then the best step would be raising the PNG size limit to a much more reasonable 1200px on the longest side (still a whole 750px smaller than popular art-sharing platforms such as Tumblr) at the cost of slightly slower load times for those “global users”, since for complex illustrations 1200px is a much more reasonable image size than 900px.


#74

@NolanOBrien Please clarify how the uploaded JPEG image quality is estimated.

How do you avoid overcompression? Recompressing an 85% JPEG image will keep shrinking the file size while continuously degrading visual quality.

If JPEG quality is estimated from quantization tables, there are perceptual encoders like Guetzli that achieve lower bitrate using finer quantization tables.

It would be great if images in both JPEG and PNG formats were not recompressed when the savings are too low, e.g. less than 5%, or when the bitrate is already reasonable.
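A minimal sketch of that check (the 5% figure is the threshold proposed here; the 85%/4:2:0/progressive re-encode settings are my assumptions for illustration):

```python
# Keep the original JPEG if recompressing it saves less than the threshold,
# avoiding another generation of quality loss for marginal savings.
import io
from PIL import Image

SAVINGS_THRESHOLD = 0.05  # 5%

def should_keep_original(jpeg_bytes: bytes) -> bool:
    img = Image.open(io.BytesIO(jpeg_bytes))
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=85, subsampling=2, progressive=True)
    savings = 1 - buf.tell() / len(jpeg_bytes)
    return savings < SAVINGS_THRESHOLD
```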

I believe the major source of complaints about quality is 4:2:0 chroma subsampling (compare [1] and [2]). There are encoders that can achieve good bitrate without it. The Google compression team came to the same conclusions [3].

Using large master images certainly improves quality, but relying on browser scaling is a huge performance hit, starting with memory: a 2048x2048 image decodes to 16MB at 4 bytes per pixel, which is critical for many users, including on mobile. Images with smaller pixel sizes could look equally good if they were scaled according to the physical pixel dimensions specified in the metadata. For example, macOS follows this convention when displaying images in Preview.

Small PNG images can safely have the CSS style image-rendering: crisp-edges / image-rendering: pixelated applied. Default browser upscaling ruins pixel art; larger images fare better, but they are not unaffected.

[1] https://getoptimage.com/media/benchmark/red-flowers-optimage.jpg

[2] https://getoptimage.com/media/benchmark/red-flowers-imageoptim.jpg

[3] https://twitter.com/jyzg/status/1031295238189576192


#75

One random note: this attached PNG has failed to upload (with a 500 ISE) both via the Twitter web UI and the Twitter API. Would be cool to see a fix for this.

I’ve worked around it in our app by resizing under certain conditions.


#76

Thank you for the technical reply, @vmdanilov.

Unfortunately, using encoders like Guetzli is completely untenable. It’s fine for sites with static assets, but when dealing with many millions of images uploaded by users every day, encoder performance is essential. Guetzli is two orders of magnitude slower than our codec, which is just not scalable in any way. These advanced JPEG encoders also sacrifice a critical feature of JPEG: progressive loading support.

https://www.keycdn.com/blog/benchmarking-guetzli

This comes up regularly though, so I just want to re-emphasize that we have been extremely rigorous about the tradeoffs we are making to maximize both quality and performance with JPEGs. It’s a matter of tradeoffs; there is no perfect choice. We are confident that, with hundreds of millions of Twitter users and billions of people encountering Tweets all over the internet, we are doing the best we can with images while continuing to experiment with improvements.

If we always served the image size that the user uploads, I could see the concern you are raising; however, we don’t push full-resolution images to users until they elect to open the full image. We serve numerous variants (sizes) of an image to better fit the size of the UI it is being displayed in. Most of the time, the image will be 680x680 until the user opens it to see the full resolution. The value of uploading in high resolution is that all variants will have good quality, and that when the full image is displayed, zooming in on it will be less prone to visible artifacts.

Basically, rest assured, this is a non-issue.

We are aware of this, but there are technical and practical hurdles to adopting this sort of change. I will take this feedback and follow up with the appropriate product owners to consider what would be necessary to accommodate it and what the consequences of such a change would entail. I personally do think that images that are enlarged should preserve pixel quality by avoiding interpolation, while shrinking an image to fit should use interpolation – we’ll see how the owners feel about that though, as a separate investigation.


#77

@cstesto, can you somehow provide a link to the original of this image? Our forums recompress attached PNGs, so it is not the same file you uploaded and I can’t test your use case.


#78

My proposal is to actually give people more incentive to optimize images before uploading, i.e. define a threshold (5-10%) and preserve the original image if the savings from recompressing it at 85% JFIF quality are below this threshold.

I briefly checked and found that uploaded images are not recompressed on re-upload. But can you explicitly state whether that is due to JPEG quality estimation or the savings threshold? Or is it just deduplication in this case?

Progressive JPEG encoding can be added losslessly using something like libjpeg’s jpegtran.
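For illustration, a lossless pass with jpegtran could be wrapped roughly like this (a sketch; jpegtran writes the transformed JPEG to stdout, hence the redirect):

```python
# Rewrite a baseline JPEG as progressive without re-quantizing: only the
# already-encoded coefficients are reordered, so no quality is lost.
import subprocess

def make_progressive(src: str, dst: str) -> None:
    with open(dst, "wb") as out:
        subprocess.run(
            ["jpegtran", "-progressive", "-optimize", "-copy", "all", src],
            stdout=out,
            check=True,
        )
```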

Guetzli is a byproduct of the new image format PIK (https://github.com/google/pik). As noted, it is optimized only for very high JPEG quality (around 94), and even there it is prone to blocking artifacts (https://github.com/google/guetzli/issues/253). I don’t see you using them, but there are more practical compressors available: https://getoptimage.com/benchmark.

The main concern here is actually quality. Not everyone can upload 2048x2048 images. For example, a 680x680 px icon with an explicit resolution of 150 dpi is better displayed at 340x340 pt, e.g. 340x340 px served for non-retina screens and the original 680x680 px for retina screens. Without physical dimensions, the icon will be stretched and blurred for retina users. Uploading a 225-300 dpi image of any size should give the same good quality as uploading a 2048x2048 image.
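A minimal sketch of honoring that metadata on the client (assuming Pillow; the 144 dpi in the comment is my own round-number example):

```python
# Derive the intended display size in points (72 pt per inch) from the pixel
# dimensions and the DPI stored in the image metadata.
from PIL import Image

def display_size_pt(path: str) -> tuple:
    img = Image.open(path)
    dpi_x, dpi_y = img.info.get("dpi", (72, 72))  # fall back to 72 dpi
    w_px, h_px = img.size
    return (w_px * 72 / dpi_x, h_px * 72 / dpi_y)

# A 680x680 px icon tagged at 144 dpi comes out as 340x340 pt, i.e. it can be
# served as a 2x asset instead of being stretched and blurred on retina screens.
```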


#79

Thanks again for replying, @vmdanilov!

Ah, I misunderstood. We’ll look at what more can be done to preserve quality in uploaded JPEGs. If users are willing to do the hard work first, it does seem reasonable to figure out a way to preserve what they have already optimized. This would be a separate effort from the current change.

That’s correct, a re-uploaded image will be unchanged. I believe, at this moment, we consider chroma subsampling, JFIF quality (via the quantization table), resolution, and progressive encoding to decide whether or not to leave the image as-is. However, this is not something to couple to, as it is subject to change. I’m not trying to be cagey, but providing you explicitly with the requirements for when we avoid a transcode risks broken expectations when we change the heuristics.

That’s fair. Something for us to look into as an optimization in a separate effort. Good point.

I think for any codec that falls outside our performance requirements (and we’ve tested them all), it will require a separate effort to preserve the encoding users have done themselves. Good feedback; we’re taking this seriously for future work.

I think the recommendations we have still stand.

  1. upload at 2048x2048
  2. scale up with full integer scaling, up to 2048x2048
  3. upload using PNG8
  4. upload PNG below 900x900

I understand this is imperfect, but we are bending over backwards to support as many use cases as possible (including the example you gave, with PNG8 or PNG under 900x900). We’ll keep iterating to balance quality and performance.


#80

Thanks for all the well-written replies giving some insight into what’s happening and why.

Something I didn’t see addressed directly is the addition of a link to view the original image. As mentioned, most image URLs can currently have :orig appended to see the original.

If the reason for this change is to reduce page/image load times for users on low-bandwidth internet, wouldn’t it make sense to serve (all) users those smaller (thumbnail) images in their feed, while still allowing users to click ‘View original image’?

It seems this could be easily achieved with all the image sizes you already have. As you said, most places already use thumbnails, since using the full-res images in small containers makes the UI slow.

Having a link to view the original image seems trivial to implement and should not affect loading times for anyone who chooses not to click on it. People on 2G networks would be aware, or could be warned, that clicking ‘original quality’ means longer load times, while still allowing everyone in the world to access them.

You previously agreed it might be good to link to imgur for the original image; why not help artists and keep users on Twitter.com by doing this for them with the data you already have?

Additionally, this would allow users to opt in to original quality for their whole feed by installing a userscript like the following: https://github.com/eight04/twitter-original-image (even better if this were an advanced setting built into Twitter).

I understand the need to standardize and improve page load times for all users, but having an option to at least manually click ‘View original image’ would give people the choice of how they want to use their bandwidth for Twitter when they can.


#81

Thanks for the comment, @MvEerd. It was well thought out and I appreciate wanting to solve for all users.

First, I should point out that the orig variant of images is an internal-only variant. Twitter makes no guarantees about it and it is not intended for any outside consumption. Don’t read orig as meaning “original” but as meaning “origin”, i.e. from the source in our data center. I would dissuade users from ever accessing that variant.

Your suggestion to have a way to persist access to the originally uploaded version of the image is fair and not unheard of. But your assertion that it would be “trivial to implement” is unfortunately not the case. Our system, built around scalability, is not really like a file system where we can store arbitrary files and be selective about which version we serve out. It would be a very complex effort and would likely require an entirely new service to be spun up for this special case of hosting original images.

Do also remember that extra variants (and increased file sizes) are not purely a matter of transferring the data; they also have an impact on our CDN hit rates. More large files getting cached on CDNs means more small files being purged from those CDNs, hurting hit rates. For every user with fast internet getting the heavy, highest-fidelity image they want, there are many more users who will face slower content loading because the content they are trying to see is no longer in proximity to them, having been purged from the CDN to make room for the heavy files. It’s a non-trivial balancing act, but we’ll keep working to improve things.

As we continue to improve the balance between quality and performance, these ideas will be kept in mind.


#82

How much information can you give about how to tell whether a PNG file will be recompressed based on a hypothetical JPEG version’s filesize?

I’m making a tool that allows users to optimize their art before uploading to twitter, by losslessly compressing it with optiPNG to see if it’ll come in small enough, and then offering to lossily compress it to 256 colors in case that looks less bad than the eventual JPEG compression will.
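Roughly, the decision flow I have in mind looks like this (a sketch only; the JPEG settings used in the size comparison are placeholders, which is exactly the information I’m missing):

```python
# Sketch: lossless pass with optiPNG, then compare against a guessed JPEG size
# to decide whether to keep the PNG or offer a 256-color PNG8 instead.
import io
import os
import subprocess
from PIL import Image

def prepare_for_twitter(src: str) -> str:
    optimized = "optimized.png"
    subprocess.run(["optipng", "-o5", "-out", optimized, src], check=True)

    # Estimate the size of the JPEG Twitter might produce (settings are guesses).
    img = Image.open(optimized).convert("RGB")
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=85, subsampling=2, progressive=True)

    if os.path.getsize(optimized) <= buf.tell():
        return optimized  # PNG is small enough to survive as-is

    # Otherwise offer a 256-color PNG8 as the lesser evil.
    png8 = "quantized.png"
    Image.open(optimized).quantize(colors=256).save(png8, format="PNG")
    return png8
```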

But how are users supposed to know which approach to take? What good does an “if it’s smaller than the JPEG” clause do if users don’t KNOW whether it’s going to be smaller than a JPEG whose settings we don’t know? The only way to learn which approach to take would be to have already posted the image and seen what filesize it came out at once it was online in front of everyone.

I know you said you don’t want to say, in case the settings wind up changing, but I can tell you now: on February 11th I’m going to upload a test file, look at the result in exiftool, and base my script on whatever it gives, and so will anyone else trying to deal with this vague rule we’re depending on. So you may as well tell us what the initial settings are going to be, rather than making us scramble on the day the change goes live to find out the same information the hard way.


#83

Just to be clear: I’m aware of the “85% quality” figure, but I’ve gotten very different filesizes out of the same “quality” setting in different programs, depending on things like 4:4:4 vs 4:2:0, etc.

Can you give us a few example files to compare against, to at least identify a way to locally compare results before the day that the switch is thrown?


#84

Really great questions, @RavenWorks. I really appreciate that you are trying to work within the boundaries to make new tools to benefit all users.

First, I think the easiest thing to do is, if the image has 256 colors or fewer, just automatically convert it to PNG8. I would like to work with the team to make that more automated on our side. For now that is just an idea, so external tooling doing it would make sense until I can make the case internally.

Using something like optiPNG or oxiPNG is a great idea; I like that a lot. It cannot scale internally, but moving that onto the user’s computer makes it much more likely to be viable.

For comparing against JPEG to see if it meets the threshold, I will share the specifics of our internal codec – however, that is entirely subject to change (as you’ve noted) and I cannot guarantee it will persist. The threshold itself could end up changing too (hopefully to become more permissive). In any case, the closest match you can get is the following:

  • libjpeg-turbo codec
  • progressive encoding
  • 85% JFIF quality
  • 4:2:0 chroma subsampling

I hope this helps with tooling you might want to develop, but it doesn’t really come with guarantees. It should be pretty good, though.
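For local comparison purposes, a sketch along these lines should approximate those settings (assuming a Pillow build backed by libjpeg-turbo, which recent binary wheels are; this is not the actual upload pipeline):

```python
# Approximate the listed settings locally: 85% JFIF quality, 4:2:0 chroma
# subsampling, progressive encoding.
from PIL import Image

img = Image.open("upload.png").convert("RGB")
img.save(
    "twitter_like.jpg",
    format="JPEG",
    quality=85,        # 85% JFIF quality
    subsampling=2,     # Pillow's code for 4:2:0
    progressive=True,  # progressive encoding
)
```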


#85

Thank you so much! :slight_smile: That’s definitely a helpful starting point. Do you think there would be public announcements somewhere if it were expected to change in the future?


#86

That’s a fair question, @RavenWorks, but I’m afraid not. The public communication has been intentionally limited so that we preserve the areas where we are free to make changes for the benefit of the platform, and it would be untenable to try to couple those to public announcements/documentation. I hope there could be a way, perhaps with the media upload Twitter API, to detect when changes do happen, for developers who are concerned enough, but I can’t make any promises about communications.

If you are ever wondering whether something has changed, though, you are free to contact me (@NolanOBrien) or @andypiper and we can see about finding out. I’m sorry if that’s not a great dependency to have, but I want to be transparent to set expectations.


#87

That’s alright, thanks anyway! Maybe I’ll just make a script that uploads a test image and compares the resulting file once a month or something…


#88

@NolanOBrien thanks for getting back to me.

I believe this link should be accessible to you: https://pasteboard.co/HZcJcJP.png

I downloaded it and it was the same size in Finder as the original image, 2.9 MB.

I also tried attaching this image to a tweet via the Twitter desktop UI and publishing it. It failed. I assume this still happens with the API, too.


#89

@cstesto, thanks for sharing. I can reproduce the issue which means our engineers should be able to debug it. I’ll file a bug report right away. Thanks!


#90

Is there a particular time of day planned for the changeover?


#91

Good question. The rollout will take a while and will likely happen in phases throughout the week. Today (Feb 11th) marks the first phase, with a subset of users and image categories being moved over. We hope to complete the transition by the end of the week, but it might take longer. I’ll comment on the announcement once it has completed.


#92

hey there! it’s me from earlier in the thread, i just wanna say i’m happy with the compromises made.

particularly, i am very interested in avatars being left alone. that has been a big point of contention for me since avatars started being compressed years ago. but i have a concern - will this change apply to tweetdeck, as it did before?
currently, my icon looks a LOT worse on tweetdeck than it does on web twitter. i’m not sure how tweetdeck handles icons, since it uses different sizes, but here’s a visual comparison anyway;

i believe that my web twitter is still compressing avatars, but tweetdeck just looks horrendous in comparison. do you know if the changes will affect tweetdeck or not? i wrote this at 12:18pm PST roughly. thanks!

EDIT: just to clarify, tweetdeck changed to compressing avatars when web twitter did. i’m not sure how linked the two are, but yeah!