
DxOMark is probably the only website that has a proper methodology for reviewing mobile phone cameras.

They are consistent, very in-depth, have objective scoring metrics, and haven't shown any bias towards any brand or manufacturer.

The iPhone 7 Plus isn't an easy phone to evaluate because it has dual cameras. Reviews of other dual-camera phones also tended to lag, and those were simpler cases where the two cameras were identical and were used to speed up capture or add extra digital zoom, whereas the iPhone 7 Plus uses two different cameras.

The iPhone 7 review was up 20 days after it was announced, and effectively about 9-10 days after it became available in Europe.

It's pretty laughable that essentially the only objective source out there, and the only one that has bothered to take a scientific approach to evaluating mobile phone camera quality, is accused of bias and paid lip service just because their current methodology isn't yet compatible with the iPhone 7 Plus. Not to mention that iOS 10 also lacks support for certain modes on the 7 Plus, and they want to give it a proper review, which means putting it in situations where both cameras can be measured objectively.




I am not saying any of your arguments don't hold up, and the iOS 10.1 upgrade of course also crossed my mind. But you have to understand that Google got early access to its review results, permission to use them in their keynote, and permission to also use competitors' results.

This is not free, even if all they received in return was services or free hardware. Once you start doing that, there's no going back.

Because they could have reviewed the iPhone 7 Plus using the iOS 10.1 betas. And what about the final number? Isn't the difference between 88, 89, or 90 also down to the personal opinion of the reviewer? How are bokeh photos and true zoom weighted? If it's 88 instead of 89, how much did the Google bribe influence that? It's still a possible deviation within their rigid testing methodology, depending on the reviewer's opinion.

I see a lot of websites that are clearly sponsored in one way or another. Samsung does it, Lenovo does it. I think Samsung and Lenovo products aren't bad but I know they spend a lot of money on massaging online review sources.

DxOMark didn't give me that "might be sponsored in some way" feeling; now I just don't know.


The company's business model is doing the testing, publishing the results, and selling the full results (as well as selling actual cameras for the iPhone and image-processing software).

Their methodology and scoring algorithms are open.

https://www.dxomark.com/About/In-depth-measurements/DxOMark-...

They have evaluated thousands of cameras, lenses, and sensors, and all of a sudden they are biased because Apple doesn't ship pre-production units to get reviewed?

This is beyond pathetic. I love my iPhone, but I never considered it to be the king of anything, even as far as camera quality goes; it was overall one of the best-rounded cameras, but for quite a long time it wasn't the king.

The iPhone 3G/3GS had pretty shitty cameras, like goddamn awful. Things started to pick up with the 4 and 5, and the 5 was probably the only period where the iPhone might have been more or less uncontested. But since the 6 there have been phones getting very close and even beating the iOS devices in certain aspects of the camera, and with this generation things are pretty much dead even as far as high-end devices go.

P.S.

It's quite likely that their full reports are available to purchase indirectly; this is how many of these niche firms work. That doesn't mean there is bias, it's just the nature of these things: these reviews are extremely expensive.


> because Apple doesn't ship pre-production units to get reviewed

You're guessing just as much as I am now. The only fact is that Google definitely had to pay to get these results and that review exactly at the moment of their keynote.

And it was definitely convenient for their slides not to have to show an iPhone 7 Plus score. It wouldn't have looked so suspicious if they had just been proud of their own score on its own.


> The only fact is that Google definitely had to pay to get these results and that review exactly at the moment of their keynote.

How is that a fact?


Dude, you're not guessing: You're talking about your butthurt hypothesis as if it were established fact. E.g.:

> how much did the Google bribe influence it?



