
> because it's so obviously a bad idea.

Agreed. The history here is compatibility with C type conversion.

I just expected a more compelling Rust/C++ comparison, but instead we got an emphasis on a poorly designed feature that the standard has already taken steps to improve.



No, implicit conversion is a deliberate C++ feature and no analog existed in C. Like a lot of awful things about C++, this is their own choice, and it's frustrating that they try to blame C for their choices.

In C++, when we define a class Foo (a thing which doesn't exist in C) and write a constructor Foo(Bar x) (which doesn't exist in C) that takes a single parameter [in this case a Bar named x], that constructor is implicitly adopted as a conversion for your new user-defined type. By default, without any action on your part, the compiler will just invoke that constructor to make a Bar into a Foo whenever it thinks that would compile.

This is a bad choice, and it's not a C choice, it's not about "compatibility".


My copy of K&R disagrees with you that implicit conversion didn’t exist in C. See section 2.7 “Type conversions”.

   “Implicit arithmetic conversions work much as expected. In general, if an operator like + or * that takes two operands (a binary operator) has operands of different types, the ``lower'' type is promoted to the ``higher'' type before the operation proceeds.”
It goes on including giving a table of all the conversions and a set of heuristics which normally apply.


Where do you see the disagreement?

Numbers are implicitly converted to the “largest” and perhaps “floatiest” representation.


> No, implicit conversion is a deliberate C++ feature and no analog existed in C

No.

> it's not a C choice, it's not about "compatibility".

One of the design goals of C++ classes is that you can create a class as powerful as int. You can’t do that without implicit conversion.


It would have been perfectly possible, not to mention obviously better, to make people actually write what they meant when defining the new type. That has no impact on the dubious priority of being able to make your own int type: you could still gift your type implicit conversions in use if that's what you want, which it often is not.

This is just another thing on the deep pile of wrong defaults in C++.


Yes, it’s a wrong default problem.



