It's kind of a meme within the photography community though. People will spend many thousands of dollars on a camera that's supposedly "the best" (pick your fave reasons, ideally as obscure as possible) and then not actually shoot with it. Looking at y'all, Leica fans.
Yeah, this is what I immediately think too any time I see an article like this. Adjustments like contrast and saturation are plausible to show as before/after, but a "before" taken prior to any sort of tone curve makes no sense unless you have some magic extreme-HDR linear display technology (we don't). Putting linear data into 0-255 pixels that are interpreted as sRGB makes no sense whatsoever; you are basically viewing junk. It's not like that's what the camera actually "sees". The camera sees a similar scene to what we see with our eyes, although it natively stores and interprets it differently to how our brain does (i.e. linear vs perceptual).
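For the curious, here's the standard sRGB transfer function (IEC 61966-2-1) as a small C++ sketch; displaying raw linear values as if they were already sRGB skips this encoding step entirely, which is exactly why it looks like junk:

```cpp
#include <cmath>
#include <cstdint>

// Standard sRGB opto-electronic transfer function (IEC 61966-2-1).
// Sensor data is linear in light; displays expect sRGB-encoded values.
double linear_to_srgb(double lin) {
    return (lin <= 0.0031308) ? 12.92 * lin
                              : 1.055 * std::pow(lin, 1.0 / 2.4) - 0.055;
}

// What a raw "before" image effectively does: quantize linear straight to 8 bits.
uint8_t pixel_naive(double lin) { return uint8_t(lin * 255.0 + 0.5); }

// What a correct pipeline does: encode first, then quantize.
uint8_t pixel_srgb(double lin) { return uint8_t(linear_to_srgb(lin) * 255.0 + 0.5); }
```

A linear mid-grey of ~18% encodes to roughly pixel value 118 rather than 46, which is why linear data viewed as sRGB looks crushed into the shadows.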
Eh, I'm a photographer and I don't fully agree. Of course almost all photos these days are edited in some form. Intent is important, yes. But there are still some kinds of edits that immediately classify a photo as "fake" for me.
For example, if you add snow to a shot with masking or generative AI, it's fake, because it wasn't actually snowing in real life. You can't just hallucinate a major part of the image -- that counts as fake to me: a major departure from the reality of the scene. Many other types of edits don't have this property because they are mostly based on the reality of what occurred.
I think for me this comes from an intrinsic valuing of the act/craft of photography, in the physical sense. Once an image is too digitally manipulated, it's less photography and more digital art.
So I'm guessing most people are downvoting this as a knee-jerk reaction to the comparison with slavery, but I think the core point is quite valid.
At some point, if people are unhappy working towards some goal, you gotta re-evaluate if the goal is worthy. I consistently meet people in other industries who really enjoy their job, whereas in tech, most of the people I know consider their job to be one of the lowlights of their life. And I don't think it's a stretch to say many, many tech jobs are not serving a worthy goal.
So it's disappointing to see people who can't look past "but business value bro", as if we got where we are because capitalism is some holy, inevitable universal law.
So for the same set of actions, it's fine if you're unaware of the underlying mechanisms, and manipulation if you are aware?
If you dig through the weeds of it you can argue just about everything we do socially is manipulation. We are social because we're social animals and will die without help from other humans (well, particularly thousands of years ago). At the end of the day, we are nice to people to get things from them that we need - food, shelter, knowledge, strength. It's always been like that. But because it makes us feel fuzzy and good, apparently that's not manipulation, that's being nice.
You can absolutely be charming towards people and play the "game" of social interaction while being quite aware that this is what you're doing. The point is that this need not involve outright lying or BS at all and that the latter is what such terms as "manipulation" actually imply in a very practical sense; not that it somehow counts against you if you're aware of what's happening at a pure level of social interaction. (In fact, the opposite is generally the case; active social awareness and mindfulness is a big part of what people variously call "EQ", "empathy", "cross-cultural competence", etc.)
Looking at the definition of manipulation, it occurs to me that manipulation must be a win-lose situation; otherwise it is persuasion. You could use the same technique, but if it's win-win for both parties it's persuasion, while if you are gaining at their expense it's manipulation. At least according to Wikipedia.
There are also white lies. Are you manipulating children if you claim Santa exists? Are you manipulating a person if you omit a truth, or tell a white lie, because you know the truth at that moment in time would be worse for their life?
That seems a little bit of an odd interpretation to me.
Persuasion is honest. "Hey, I think you should do this thing because of reasons a, b, and c; there are some downsides like y and z. It may mean something to me personally, so I may also appeal to you to do it for me as a favour. I may even play up how important I think it is."
Manipulation is dishonest. "Hey, I'm going to use an underhanded technique to make you feel like you're missing out on something, or are inadequate, to get you to do this thing. Maybe I'll go overboard on flattery and inflate your ego to achieve my end. I also might lie or omit some of the downsides to give a distorted view of the risks"
Even if it's a win-win situation, it's still manipulation if you're seeking to bypass someone's agency.
> Are you manipulating a person if you omit a truth, or tell a white lie, because you know the truth at that moment in time would be worse for their life?
Yes, certainly, and that's why people often get upset about "little white lies" too. Maybe you are doing a good thing, maybe you're not, but removing agency from someone by keeping the truth from them is always manipulative.
The wiser question may be "is manipulation always wrong?"
And I'd argue that if it gets your kids to calm down and go to bed on Christmas Eve, maybe not ...
If I get sad or angry when a friend tells me a story, this feeling is an expression of my inner state, not a strategic choice I make to get to a certain place. And this inner state usually translates into how people act later. So if I am enraged at how my friend was treated, I may be inclined to take steps that help them get even, for example.
Manipulation, however, is when I (feeling nothing), pretend to feel a thing with the goal of getting a certain response.
The border between the two is of course not totally clear-cut, and people can manipulate themselves into truly feeling things without following through with any actions, etc. It's a complex topic, but the reason manipulation works in the first place is that the feelings people express towards us are more often than not an expression of how they will act towards us as well. If a guy on the street screams at you, your #1 interpretation won't be that he does it to manipulate you, but that that person is experiencing an actual feeling that may convert to physical action pretty soon.
"we are nice to people to get what we want" is flat out not true. We are nice to people because cooperative societies out performed the non-cooperative ones on the macro level. On a micro level this kind of attitude sometimes/often prevails, we call the people who act like this "jerks", and the people who try to justify it with these kinds of rationale "sociopaths", because to the group as a whole its so incredibly damaging, and to the individuals on the other side of it, insufferable.
> We are nice to people because cooperative societies outperformed non-cooperative ones at the macro level
I.e. biology gets what it wants... We want to survive, mother nature wants us to survive, society wants to survive.
I am absolutely not suggesting that outright jerkish behaviour is acceptable (although to suggest jerks have no social success is probably untrue; plenty of people are attracted to jerks). I am arguing that if there were no personal advantage whatsoever to being social and nice to people, we wouldn't do it. We'd be lone animals, spread out across the land rather than concentrated in towns and cities. There's a spectrum of selfish behaviour, right? We are somewhere in the middle because it's advantageous to be.
Both are true. We want to survive, and being nice to others increases our likelihood of survival. Wanting to survive is itself selected for by evolution, and so is wanting to be nice, since it helps us survive in a group setting, which increases survival odds too.
> Compressing data means you save space on the disc... If you conveniently ignore the fact that common.lin is duplicated in each map's directory and is the same for every map I tested, which kinda negates part of this.
This is an interesting thing I've noticed about game dev, it seems to sometimes live in a weird space of optimisation requirements vs hackiness. Where you'll have stuff like using instruction data as audio to save space, but then forget to compile in release mode or something. Really odd juxtaposition of near-genius-level optimisation with naive inefficiency. I'm assuming it's because, while there may be strict performance requirements, the devs are under the pump and there's so much going on that silly stuff ends up happening?
There was a running theme in Mythic Quest about the engineers sweating over the system while monetisation just bolted on a casino.
This also happened in GTA5 [0]: there was a ridiculous loading glitch that was quite well documented on here a while ago. Also a monetisation bolt-on.
So you have competing departments, one of which must justify itself by producing a heavily optimised system, and another which is licensed to generate revenue at any cost...
There was a similar issue with DOOM's framerate: I'm assuming an intern got tasked with adding the "blink the LED on the fancy mouse" code (due to a marketing partnership), and it absolutely _trashes_ the framerate!
Loading happens once per session and is less painful than frame stuttering all game, for example, so given a tight deadline one would get prioritized over the other.
I tried playing GTAO when it was free, and oh boy. Loading for 10 minutes, arriving in the game only to see you're not with your friends. So 10 more minutes to load into their server. Then you start a mission and get 10 more minutes of loading. The server disconnected? A 10-minute load to go back, without your friend. Join your friend? You guessed it: 10 more minutes of loading.
For a billion-dollar game, it's insane that I spent more time loading than playing. Imagine how many more $$ they could have gotten if players could double their play time.
Loading in GTA Online absolutely does not happen once per session. It happens before and after every mission and activity. I am not sure whether it's a full load/was also affected by that bug, but I can certainly tell you that around 20% of my GTAO "playtime" consisted of staring at a load screen.
Exactly that - once it’s shipped it’s shipped. Doesn’t matter if the code is “clean” or “maintainable” or whatever.
The longer it’s not released for sale, the more debt you’re incurring paying the staff.
I’ve worked with a few ex-game devs and they’re always great devs, specifically at optimising. They’re not great at the “forward maintainability” aspect though because they’ve largely never had experience having to do it.
common.lin is a separate file which I believe is supposed to contain data common to all levels _before_ the level is loaded.
There's a single exported object that all levels of the game have called `MyLevel`. The game attempts to load this and it triggers a load of the level data and all its unique dependencies. The common.lin file is a snapshot of everything read before this export. AFAIK this is deterministic so it should be the exact same across all maps but I've not tested all levels.
A level can contain multiple distinct parts; the training level, for instance, has two. Part 1 of the map loads 0_0_2_Training.lin, and the second part loads 0_0_3_Training.lin. These parts are completely independent -- loading the second part does not require loading the first. Loading a part does a complete re-launch of the game using the Xbox's XLaunchNewImage API, so I think all prior memory should be evicted (though maybe there's some flag I'm unaware of). That is to say, I'm fairly confident they're mutually exclusive.
So basically the game launches, looks in the "Training" map folder for common.lin, opens a HANDLE, then looks for whichever section it's loading, grabs a HANDLE, then starts reading common.lin and <map_part>.lin.
There are multiple parts, but only one common.lin in each map folder. So there's no way for common.lin to sit in a contiguous disc region leading into every <map_part>.lin. Part 1 may be right after common.lin, but if you're loading any other part you'll have to seek.
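To make that concrete, here's roughly that flow sketched with plain Win32-style calls -- the actual Xbox API differs in details, and the paths and buffer sizes here are just illustrative:

```cpp
#include <windows.h>
#include <cstdio>

// Rough sketch of the described load flow. Paths and sizes are
// illustrative, not taken from the game.
bool LoadMapPart(const char* mapDir, const char* partFile) {
    char path[MAX_PATH];

    // 1. Open common.lin from the map's folder (shared pre-level data).
    snprintf(path, sizeof(path), "%s\\common.lin", mapDir);
    HANDLE hCommon = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, nullptr,
                                 OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (hCommon == INVALID_HANDLE_VALUE) return false;

    // 2. Open whichever section is being loaded, e.g. 0_0_3_Training.lin.
    snprintf(path, sizeof(path), "%s\\%s", mapDir, partFile);
    HANDLE hPart = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, nullptr,
                               OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (hPart == INVALID_HANDLE_VALUE) { CloseHandle(hCommon); return false; }

    // 3. Read common.lin first, then the part file. Unless the part file
    //    happens to sit directly after common.lin on disc, the second
    //    read forces a seek on optical media.
    char buf[64 * 1024];
    DWORD read = 0;
    while (ReadFile(hCommon, buf, sizeof(buf), &read, nullptr) && read > 0) { /* parse... */ }
    while (ReadFile(hPart,   buf, sizeof(buf), &read, nullptr) && read > 0) { /* parse... */ }

    CloseHandle(hPart);
    CloseHandle(hCommon);
    return true;
}
```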
I don't know enough about optical media seek times to say if semi-near locality is noticeably better for the worst case than the files being on complete opposite sector ranges of the disc.
They were doing this kind of optical-media seek-time testing and optimisation for PS1 games, like Crash Bandicoot.
You certainly have more and better context than me on this console/game, I just mentioned it in case it wasn't considered.
By the way, could the nonsensical offsets be checksums instead?
IIRC the average seek time across optical media is around 120ms, so ideally you want all reads to be linear -- even 100 stray seeks adds ~12 seconds of pure head movement.
I remember one game I worked on where I spent months optimising loading, especially the boot flow, to ensure that every file the game was going to load was the very next file on the disk, or else that the next file was an optionally loaded file that could be skipped (as reading and ignoring was quicker than seeking). For the few non-deterministic cases where order couldn't be predicted (e.g. music loaded from a different thread), I preloaded a bunch of assets up front so that the rest of the assets were deterministic.
One fun thing we often did around this era was eschew filenames and instead hash the name. If we were loading a file directly from C code, we'd use the preprocessor to hash the name via some complicated macros, so the final call would be compiled like LoadAsset(0x184e49da), but we still retained a run-time hasher for cases where the filename was generated dynamically. This seems like a weird optimisation, but avoiding the directory scan and filename comparisons can save a lot of unnecessary seeking / CPU operations, especially for multi-level directories. The "file table" then just became a list of disk offsets and lengths, with a few gaps because the hash table size was a little bigger than the number of files to avoid hash conflicts. Ironically, on one title I worked on we had the same modulo for about 2 years in development, and just before launch we needed to change it twice in a week due to conflicts!
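Something like this, sketched with modern C++ constexpr instead of the original macro soup, and assuming an FNV-1a-style hash (the actual hash function and the `LoadAsset` internals here are hypothetical):

```cpp
#include <cstdint>
#include <cstdio>

// FNV-1a, usable at both compile time and run time.
constexpr uint32_t HashName(const char* s, uint32_t h = 2166136261u) {
    return *s ? HashName(s + 1, (h ^ uint32_t(uint8_t(*s))) * 16777619u) : h;
}

// Hypothetical loader keyed by hash rather than by filename string: the
// hash indexes a table of (disk offset, length) pairs, so there's no
// directory scan or string comparison at all.
void LoadAsset(uint32_t nameHash) {
    printf("loading asset %08x\n", (unsigned)nameHash);
}

int main() {
    // Folds to a constant at compile time, like LoadAsset(0x184e49da).
    constexpr uint32_t kPlayerModel = HashName("models/player.mdl");
    LoadAsset(kPlayerModel);

    // The same hasher still works at run time for generated filenames.
    char name[64];
    snprintf(name, sizeof(name), "music/track%02d.ogg", 7);
    LoadAsset(HashName(name));
    return 0;
}
```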
Mel's job was to re-write
the blackjack program for the RPC-4000.
(Port? What does that mean?)
The new computer had a one-plus-one
addressing scheme,
in which each machine instruction,
in addition to the operation code
and the address of the needed operand,
had a second address that indicated where, on the revolving drum,
the next instruction was located.
>By the way, could the nonsensical offsets be checksums instead?
If you're referring to those weird "addresses" that quickly became irrelevant, there's a CRC32 somewhere in the header immediately after them. The address value is the same across files with different contents too.
I was talking to a friend of mine about it and he suggested that maybe whatever process generated the files included the file's load address in case it could be mapped to the same address for some other optimization?
ISO9660 has support for something that resembles hard links -- i.e. a file can exist in multiple places in the directory structure but always point to the same underlying data blocks on disc.
I think XISO is derived from ISO9660, so may have the same properties?
Definitely could be a factor; I know of a programmer who works at a Dutch company that mainly does ports of AAA games (he may be on here too, hello!), he once wrote a comment or forum post about how he developed an algorithm to put data on a disk in the order that it was needed to minimize disk seeks. Spinny disks benefit greatly from reading data linearly.
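A toy version of that idea, assuming you have an asset-request trace from a profiling run (function and variable names here are made up):

```cpp
#include <string>
#include <unordered_set>
#include <vector>

// Toy sketch of seek-minimising layout: place files on disc in the order
// a profiling run first requested them, so a typical load becomes one
// long linear read instead of a series of seeks.
std::vector<std::string> LayoutForLinearReads(
    const std::vector<std::string>& accessTrace,   // files in request order
    const std::vector<std::string>& allFiles) {    // everything on the disc
    std::vector<std::string> layout;
    std::unordered_set<std::string> placed;

    // First-seen order from the trace drives the layout.
    for (const auto& f : accessTrace)
        if (placed.insert(f).second) layout.push_back(f);

    // Files never touched during the trace go at the end, out of the way.
    for (const auto& f : allFiles)
        if (placed.insert(f).second) layout.push_back(f);

    return layout;
}
```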
As a former game dev: it was a combination of all that. You're also often starting a project on tooling that is bleeding edge, that you have no experience with, and that isn't properly tested or documented.
Then game dev was always full of fresh junior devs with tons of energy, ideas, and dreams, but who came from homebrew, where things like reliable, beautiful, readable code are unnecessary.
And tons of things get missed. I keep hoping that the one published game I have was accidentally built with debug symbols in it so it can be easily traced. Two of us on the project were heavily into performance optimization, and I absolutely remember us going through compiler options, but things were crazy. For one major milestone build for Eidos, I was hallucinating badly when I compiled and burned the CD because I'd been working for three days straight with no sleep.
And passion to deliver. Engineers will kill themselves for a game release for no extra money and far less salary than their abilities would demand at a bigcorp. But they love it so they do it, and hack as best they can, to get their art into the world.
I bought into that scam. My dream job was video game dev. I spent my whole childhood writing video games. It was how I ended up doing it professionally for near zero money and 80 hour weeks and burning out completely in two years.
Yeah, I also had this thought. Despite popular belief, the film look is not magic. Especially since the article mentions they already had monitors calibrated for the film look during development -- why couldn't they replicate that for the digital release? The digital release is so blatantly different that I have a hard time believing it's the best they can do with modern colour pipelines. I mean, the colour grading is completely different!
Surely either it was intentional or a fairly simple case of poor colour management.