IEEE will no longer accept Playboy's "Lenna" image for image processing research

True, and yet what is "hot" is also determined by what is seen the most; see advertising. And even if that were not the case, we don't want software like AI being that biased, either for or against anything. I agree that bans themselves really are not the best tool, but I still have not read a comment anywhere in this thread that offers a non-ban solution. Just lots of posturing...
Because there's no need for a non-ban solution: a ban works perfectly.

Lena Forsén has withdrawn consent to use the photo in technical papers. That really should be the end of the discussion. Unfortunately, copyright law prevents her from doing anything about it directly.

Researchers who use her image are either ignorant or malicious (if they know that she has withdrawn consent, including her image in a paper is simply unethical). A ban solves both of these issues.
 
True, and yet what is "hot" is also determined by what is seen the most; see advertising. And even if that were not the case, we don't want software like AI being that biased, either for or against anything. I agree that bans themselves really are not the best tool, but I still have not read a comment anywhere in this thread that offers a non-ban solution. Just lots of posturing...
That's a rough one to answer. I mean, if we could time travel and grab a man from the 1500s who knows nothing about tech, show him two women and ask him who's beautiful, his choices being one who's considered "gorgeous" and one who looks like Quasimodo, I think we all know which he'd pick, and he's never been advertised to.

It's just how humanity is built; we'll always veer toward the traditionally healthy or best-looking partner. That's how nature works: animals want the best options for their offspring. We may be smart, but we're still animals running on instinct, and that's why, IMHO, even with AI we're going to build those same "laws" in, because without them we'd just be handicapping ourselves. Sure, nowadays being hefty or something is OK... but is it really? Who am I to say.

It's a grim thing to think in our modern and inclusive world, I know. I'm not for banning anything, because to me, the woman in that photo is pretty much normal to see wandering around where I live.
 
Lena Forsén has withdrawn consent to use the photo in technical papers. That really should be the end of the discussion. Unfortunately, copyright law prevents her from doing anything about it directly.
She can’t “withdraw consent” 50 years later… She consented to the photo shoot with Playboy - and her modeling career basically sprang from that. Don’t think of her as a victim here.

Playboy is the company that owns the image - and they were fine with it being used without permission. Mostly because they gained popularity because of it - and the culture of suing over everything wasn’t prevalent in the 70s…
 
What I find ironic, but not surprising, is that the people who are really losing their **** about this are the people who whine and ***** about "woke". They claim it's the "woke" who have their panties in a twist, but it's just projection. The IEEE decided to move on from the boys' club mentality after 51 years. How horrible.
 
Maybe it's just me, but given the choice to be remembered for my contributions to science and tech or for stripping off in public, I know which I'd go for. It's an odd thing to request, and one I imagine her family may not have been inclined to make if she'd already passed on.

Nevertheless, that research is done and dusted, and it's trivial to come up with new and better samples and control data for testing future projects, so this is an academically acceptable position to take regardless of the suggested reason for doing so.
 
The people who don't like this change don't have any skin in the game. This imposes literally zero burden on any academic writing a paper.
If it was a useless image, why was it popularized in the first place? In any case, the issue is the ban, not the image itself. Also, the model lost all her rights when she agreed to model for Playboy. Add to that the fact that research allows for its use without consent from anyone.

It's as foolish as banning the terms master and slave in computer languages.

It's like the Academy Awards now. I don't take them seriously at all because they have implemented racial quotas for movies to be considered for an Oscar.
 
Don't know why the image would offend anybody. There are statues and paintings all over Europe depicting nudes of all sorts. Why do those not offend?

Different context. Those statues and paintings are "art", they represent an idealized medium. The Lena image was explicitly from a Playboy centerfold--it was, first and foremost, pornography. By definition, it was made to appeal to a sexual appetite. The fact that it was used in computer science publications does not negate this fact, it just means people were okay using it. The 1970s was 50 years ago and sentiments around the "time and place" for sexually-suggestive content have changed significantly.

In that time, a myriad of better pictures (by better, I mean "context-neutral") for testing contrast and edge detection have been produced, or should have been, because otherwise "what were people doing for half a century, twiddling their thumbs?" The only real benefit of using the Lena image is to invoke the idea of "what publishing in a science journal in the past felt like", aka nostalgia. Other than that, it has exhausted its usage and should be retired.
 
The reason standard images such as Lena are used in lossy image compression, or more precisely in the quantisation process, is that numerical metrics like the peak signal-to-noise ratio (PSNR) or root mean square error (RMSE) often do not correlate with the subjective perception of quality. Benchmarking algorithms solely on numerical metrics can therefore be deceptive, since an algorithm with a much better PSNR or RMSE can still produce images with terrible artifacts when viewed with the human eye. This makes it hard to properly compare quantisation algorithms. For this reason, standard test images, with Lena being by far the most commonly used, are used to benchmark the algorithms.
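For anyone curious, here is a minimal sketch of how those two metrics are computed, assuming both images are 8-bit arrays of the same shape (NumPy is used purely for illustration, and the 255 peak value is just the usual 8-bit default):

```python
import numpy as np

def rmse(original: np.ndarray, compressed: np.ndarray) -> float:
    """Root mean square error between two same-shaped images."""
    diff = original.astype(np.float64) - compressed.astype(np.float64)
    return float(np.sqrt(np.mean(diff ** 2)))

def psnr(original: np.ndarray, compressed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; peak is the maximum pixel value (255 for 8-bit)."""
    err = rmse(original, compressed)
    if err == 0:
        return float("inf")  # identical images: PSNR is unbounded
    return 20.0 * np.log10(peak / err)
```

The point is that a lower RMSE (or higher PSNR) only measures average pixel error; it says nothing about how visible or objectionable the remaining artifacts are to a human viewer, which is exactly why subjective checks on standard test images are still needed.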
While preparing an assessment the other day I was already wondering whether it would still be appropriate to use Lena, but it is a bit sad to see her face being banned. It will now be harder to compare results published in numerous old publications with new "post-Lena" ones.
 
True, and yet what is "hot" is also determined by what is seen the most; see advertising. And even if that were not the case, we don't want software like AI being that biased, either for or against anything. I agree that bans themselves really are not the best tool, but I still have not read a comment anywhere in this thread that offers a non-ban solution. Just lots of posturing...

" we don't want software like AI being that biased.", a bit late for that.
 
I'd get it if any of this new "research" used a full-body NSFW image, but if they're talking about the headshot pictured above, have they seen what a typical Instagram feed looks like? And btw, I'm no expert, but I believe this type of glamor image is most commonly posted and viewed by women.

It's not about the shot itself. It's about the idea that this image exists in the first place for men to look at a beautiful woman with sexual overtones. And in the woke ideology, (heterosexual) male sexuality = bad and evil.
 
The people who don't like this change don't have any skin in the game. This imposes literally zero burden on any academic writing a paper.

The reason for the change wasn't academic in nature. It's cultural and ideological, and therefore we all suddenly do have skin in the game.
Not everyone wants to see their culture forcibly sanitised of anything (hetero)sexual by a tiny minority of extremist woke prudes who happen to be in positions of power.

The original model (Lena Forsén) says it's time to move on; there are plenty of other images that don't carry any societal baggage, and if a paper can demonstrate its point on that image it can demonstrate it on others.

She can say what she wants, but the image isn't hers (Playboy owns it), and from the article it's obvious that the move to retire this image didn't originate with her, but with DEI bureaucracy and woke activists.

Academics don't need to be distracted by this kind of history when reading a paper. Best case it's a distraction, worst case it's offensive. It doesn't matter which it is to any single person; Nature and IEEE have decided (rightly so) that there are enough other options out there that we don't need to worry about it - just don't use this image and problem solved.

Academics are adults. I don't think they need to be babysat by you or other woke prudes in DEI bureaucracies.
Everything is offensive to someone somewhere. Woke prudishness is offensive to me.
 
That's a rough one to answer. I mean, if we could time travel and grab a man from the 1500s who knows nothing about tech, show him two women and ask him who's beautiful, his choices being one who's considered "gorgeous" and one who looks like Quasimodo, I think we all know which he'd pick, and he's never been advertised to.

It's just how humanity is built; we'll always veer toward the traditionally healthy or best-looking partner. That's how nature works: animals want the best options for their offspring. We may be smart, but we're still animals running on instinct, and that's why, IMHO, even with AI we're going to build those same "laws" in, because without them we'd just be handicapping ourselves. Sure, nowadays being hefty or something is OK... but is it really? Who am I to say.

It's a grim thing to think in our modern and inclusive world, I know. I'm not for banning anything, because to me, the woman in that photo is pretty much normal to see wandering around where I live.
Some things stay the same, but if you look at pictures or statues of what was considered attractive at different times in history, it was flexible enough that many would not be considered attractive today, especially in terms of BMI ;) but also facial features. That definitely fluctuates over time.
 