Python has brought computer programming to a vast new audience (economist.com)
252 points by leonagano on July 19, 2018 | 261 comments


The thing I love most about the language is its conciseness, on several levels.

It's syntactically concise because of its use of whitespace and indentation instead of curly braces.

Expressions are often concise because of things like list slicing and list expressions.

It can be linguistically concise, because it is so easy to fiddle with the data model of your classes and customize their behavior at a deeper level. For example, it is quick to design a suite of classes with algebraic operators, which leads to elegant mathematical expressions over your custom GUI objects. You can also change the way classes are instantiated by overriding __new__, which I've used to make powerful libraries that are a joy to use.
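Roughly the kind of thing I mean (a toy sketch, with made-up Vec and Interned classes):

    class Vec:
        def __init__(self, x, y):
            self.x, self.y = x, y
        def __add__(self, other):
            # algebraic operator: Vec(1, 2) + Vec(3, 4) behaves like Vec(4, 6)
            return Vec(self.x + other.x, self.y + other.y)

    class Interned:
        _cache = {}
        def __new__(cls, name):
            # customise instantiation: one shared instance per name
            if name not in cls._cache:
                inst = super().__new__(cls)
                inst.name = name
                cls._cache[name] = inst
            return cls._cache[name]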

Further elegance can be added through creative use of the flexible argument passing machinery, and the lovely way that functions are also objects.
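A toy example of what I mean (a hypothetical retry helper, nothing more):

    def retry(func, *args, attempts=3, **kwargs):
        # func is just an object we were handed; *args/**kwargs forward
        # whatever signature the callee actually expects
        for attempt in range(attempts):
            try:
                return func(*args, **kwargs)
            except Exception:
                if attempt == attempts - 1:
                    raise

    # usage: retry(open, "data.txt", mode="r")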

Your application architecture tends to be concise because there are so many existing libraries that you can often design your application at a more interesting, higher level. For example, you don't have to worry about the size of ints.

My big regret with Python 3.0 is that the scoping rules in list expressions changed to prohibit modification of variables in surrounding scope. This degraded the utility of list expressions. The work around is so long-winded.

Besides the odd gripe here and there, the language really gives me a warm glow.


"Besides the odd gripe here and there, the language really gives me a warm glow."

That's a perfect summary of my feeling towards it, too. If the PSF puts up a GoFundMe for sending Guido on a nice vacation of his choosing, I will be happy to chip in a few bucks. Same goes for Linus when he hands over the reins, and a few other maintainers who probably cannot be adequately compensated for the impact of their work.


I agree. I came to Python in 2000, after Perl (1996-1998), PHP (1999), Matlab (1992-1996), C++ (1990), Fortran (1989), C (1988), Pascal (1988), Assembly (1984-86) and Basic (1981-83).

After all those, I found Python, as the saying goes, really does "fit your brain" [1]. Since discovering Python 18 years ago, I have looked at Haskell, successfully avoided Java, and promised myself to "one day" do more JS. Still, Python remains 95% of what I do, and I don't see a problem with that :)

[1] "Python fits your brain" was the theme of the 9th International Python Convention in 2001.


> "Python fits your brain"

For me, I find using python is like touching both data and code with my hands.

As much as it's trendy to bash Object Oriented languages, the way Python allows you to look up all the variables and methods on a class or an object makes it feel like using your hands:

    dir(MyClass)
    help(MyClass)
    dir(my_object)
    help(my_object)

etc...
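For example, with a throwaway class (just a made-up Greeter):

    >>> class Greeter:
    ...     def hello(self, name):
    ...         """Say hello to name."""
    ...         return "Hello, " + name
    ...
    >>> [m for m in dir(Greeter) if not m.startswith('_')]
    ['hello']
    >>> help(Greeter.hello)    # shows the signature and the docstring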


If you'd like to go further along that path, I'd really recommend trying to learn a Lisp. How to Design Programs[0] is a good, gentle introduction if you're new to the ideas of Lisp or functional programming, and SICP[1][2] is the old classic.

[0] http://www.ccs.neu.edu/home/matthias/HtDP2e/index.html

[1] http://web.mit.edu/alexmv/6.037/sicp.pdf

[2] https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PLE18841CAB...


Which one though?

I have looked at Lisps, but it's difficult to pick one to invest time in. Because I am so restricted on time I have to stick to languages which are used in the industry (so my fun has real-life ROI), which I think boils down to CL, which is often touted as old-fashioned and whose market seems to shrink slowly, and Clojure, which means the JVM, mainly Java libraries, and high memory requirements.

For pure fun and practicality, Racket does seem up on the "list", but not used at all in the industry. A lot of languages (Python, Rust) are adopting the good parts of lisp (except for the minimal syntax, sadly), and I'm getting better mileage from spending time learning and becoming better at those than starting from scratch in a language that I won't be able to use during work hours.


So, I really like How To Design Programs, because the thing it got across for me is how to appropriately think in a functional way. It walks you through building data structures and full programs in a really nice method. I've now been going through SICP, and nothing has seemed out of place as the notation is almost identical, and the few places that it isn't are not hard to translate.


My question was more about which program language, which Lisp to choose in 2018?


Oh, so How To Design Programs uses "teaching" languages, which are sub-languages of Racket. Going from that book to learning the full features of Racket has been fun.


Totally the other way around for me. Perl fits my brain. Python I can't comprehend.


Ask Naomi Ceder who is the chair of the PSF. She's super nice.


It's also nicely self-contained, especially if you use a distribution like Anaconda. You don't have to spend hours fighting with the configuration and tooling. It's also very easy to debug and step through code using Spyder, which is similar to MATLAB. It makes it easy for someone who is smart and generally competent with computers to be productive in a short amount of time without first amassing huge amounts of arcane knowledge-- knowledge that isn't actually helpful to making working code!


I've found it very difficult to "self-contain" python projects that are even of modest size. Anaconda is a mediocre solution that is the best among other terrible options.


Agreed with all the posts above; the only real letdown for me was always the difficulty of producing standalone executables. I would really love a Python equivalent of uberjars or Go's build tool.


I recently used PyInstaller to wrap Black (the code formatter) and it worked perfectly in two minutes. Literally one command to take the whole script and produce a standalone executable. There's also something like this: https://build-system.fman.io/
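If you want to try it, the invocation was roughly this (assuming your entry point is myscript.py):

    $ pip install pyinstaller
    $ pyinstaller --onefile myscript.py
    # the bundled executable ends up under dist/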


The thing about PyInstaller is that it works absolutely perfectly... until it doesn't, in which case you are stuck spending days trying to kick everything in your project into shape in the hope of it working.

That is what I hate most about Python: there is only one sure-fire way of doing packaging, and that's setting it up even before you write your first line of code and then testing it constantly as your project evolves.

I and others have absolutely no trust in the several build pathways because they all fail horribly for a wide range of reasons. By contrast, if I have a C++ or C# project I'm very confident that it will almost always be a breeze to package it up.


This was circulating around HN a few days ago: https://github.com/facebookincubator/xar


CORRECTION: My regret was referring to a problem where this code worked in Python 2 but not Python 3:

    x = 100
    y = [x+i for i in range(2)]
In Python 3 I kept seeing

    NameError: name 'x' is not defined
However, after working around this problem with fixes like this -

    x = 100
    y = (lambda x=x: [x+i for i in range(2)])()
    
I have just discovered that the different scope treatment was due to a problem with IPython when using it in embed mode, for example by calling it with the following command:

    $ python -c "import IPython;IPython.embed()"
It goes away when you call ipython directly

    $ ipython

I mistakenly attributed this problem to Python 3 for years; I guess I introduced it because I had different ways of invoking Python 2 and Python 3 - yeesh.


> My big regret with Python 3.0 is that the scoping rules in list expressions changed to prohibit modification of variables in surrounding scope. This degraded the utility of list expressions. The work around is so long-winded.

Ironically, the PEP to add this to py3k is what's causing all the fuss in pythonland lately.


Actually I think it was 'fixed' as a syntax change in Python 3, https://docs.python.org/3/whatsnew/3.0.html#changed-syntax "List comprehensions no longer support the syntactic form..."


He's talking about something a little different. Python is introducing assignment expressions (vs assignment statements).


> My big regret with Python 3.0 is that the scoping rules in list expressions changed to prohibit modification of variables in surrounding scope. This degraded the utility of list expressions. The work around is so long-winded.

Can you elaborate? I'm not sure what this is referring to.


In Python 2.x the loop variable is part of the containing scope:

  >>> a
  Traceback (most recent call last):
    File "<stdin>", line 1, in <module>
  NameError: name 'a' is not defined
  >>> [a*2 for a in (1,2,3)]
  [2, 4, 6]
  >>> a
  3
In Python 3.x, it is not:

  >>> a
  Traceback (most recent call last):
    File "<stdin>", line 1, in <module>
  NameError: name 'a' is not defined
  >>> [a*2 for a in (1,2,3)]
  [2, 4, 6]
  >>> a
  Traceback (most recent call last):
    File "<stdin>", line 1, in <module>
  NameError: name 'a' is not defined
It is sometimes useful to know why something failed. The following Python 2 code will not work in Python 3:

  >>> import math
  >>> try:
  ...   values = [math.sqrt(x) for x in (1, 2, -3, 4)]
  ... except ValueError:
  ...   print("cannot compute sqrt(%r)" % x)
  ...
  cannot compute sqrt(-3)
I have mostly found this feature of Python 2 to be useful on the interactive shell, when trying to diagnose what caused an error in my list comprehension. Eg.

  >>> with open("tmp.dat") as input_file:
  ...   fields = [line.split()[3] for line in input_file]
  ...
  Traceback (most recent call last):
    File "<stdin>", line 2, in <module>
  IndexError: list index out of range
  >>> line
  'one two three\n'
Oops, there are only 3 columns, not 4.


When you use IPython you can just drop into the context of the error with %debug after it has occurred, and then print the line there. No need for the bleeding scope at the language level.


In Python 3.8 you will be able to use assignment expressions:

    [a := x*2 for x in (1, 2, 3)]
See PEP 572 -- Assignment Expressions: https://www.python.org/dev/peps/pep-0572/
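Note that, per the PEP, the := target binds in the enclosing scope, so (if I'm reading it right) the old leakage comes back, but only for names you opt into:

    >>> [a := x*2 for x in (1, 2, 3)]    # Python 3.8+
    [2, 4, 6]
    >>> a
    6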


I'm the same: I would never use that side effect in production code, but I do miss having it in IPython, though my use is slightly different.

I often did:

    >>> [d for d in drawings]
    >>> [d.<tab> ...]
Because I'm lazy and I want the autocomplete to be there for me. It's actually the only thing I miss from Python 2, and I will admit that a couple of times the side effects have caught me out in production code, so I'm happy with the change.

It's worth noting that the latest Python PEP is going to give you a way to assign to variables in the outer scope again:

https://www.python.org/dev/peps/pep-0572/#simplifying-list-c...


In Python 2, "[x for x in range(10)]; print(x)" prints 9. x is available outside the list comprehension.

In Python 3, it raises a NameError, because the x in "print(x)" is not defined. It's contained in the list comprehension.

But I don't know a good use of the Python 2 behavior.


I went through a course in python about a decade ago and I enjoyed it! Then along the way, I picked it up here and there. Then used it at work for a few months to sort through a lot of data.

Recently I started trying to learn Swift through the Stanford course, and the curly brackets confused the heck out of me. They still do! They just seem to pop up out of nowhere. All to say, I completely agree with your second paragraph. Python is very concise.


Curly brackets fulfill the exact same function as indenting a code block in Python - they define scope and tie code together.


Usually braces define the scope (or lifetimes) of variables.

C/C++/Java use them quite a bit.


Do you know an ML language like SML, OCaml, or F#? If not, you should really give one a go, especially F#. The conciseness and cleanliness is just as good as Python's, if not better, and there are far more options for data modeling and expression.


> But in the past 12 months Google users in America have searched for Python more often than for Kim Kardashian, a reality-TV star.

This seemed so unlikely that I thought the Economist had incorrectly used the Google Trends comparison tool, but they are right:

https://trends.google.com/trends/explore?geo=US&q=%2Fm%2F05z...

The gap is even bigger when you consider a worldwide audience (i.e. one less focused on American gossip):

https://trends.google.com/trends/explore?q=%2Fm%2F05z1_,%2Fm...


To be fair, a single python dev is going to search for queries about python dozens of times per day. Even the most ardent Kardashian fan is unlikely to search for Kardashians in any volume even close to that.


If we're using Google to approximate mind share then it's skewed way farther than that. Nobody who follows them is sitting around doing Google searches. They're already following them on Snapchat, Facebook, Insta, and Twitter.

You only get searches when there's breaking TMZ news.

... I say this with detailed observations of my girlfriend's media habits.


I find your lack of dedication disheartening.


I put in Beyoncé (with or without accent) and Kylie Jenner, and Python still looks higher on most days. Surprising.


Someone else said this on HN: Python is the second best language in almost every field of computing.

That's why there's a library for everything in Python.

Its limitations are also not terrible for most people. If you want super fast, you will know how to do that some other way. If you just want simple and easy to read, you use Python.


> Someone else said this on HN: Python is the second best language in almost every field of computing.

It isn't. Frankly, it isn't even top 3 in any "field of computing" depending on how you define "field of computing".

What python has going for it is the evangelists who do a great job of spreading the word and of course funding - they have lots of money backing the language. Also, being an interpreted language, it has a much lower barrier to entry.

I'm not much of a fan of python and its block syntax - I very much prefer braces or markers delineating the beginning and end of code blocks.

Also, python wants to be everything ( functional, OO, imperative, etc ) and hence is the worst language to learn programming language topics in my humble opinion.

My hope is that python will serve as a "gateway drug" to other languages like ruby, ML, C/C++, SQL, etc., so that people will learn what programming and languages are rather than using python as a glorified calculator, which I'm sure many are doing.


"My hope is that python will serve as a "gateway drug" to other languages like ruby, ML, C/C++, SQL"

Your inclusion of SQL in that list really shows you have no idea what you're talking about.


SQL certainly is a programming language. It is a member of the declarative family of languages (eg. like Regular Expressions, or Prolog), rather than an imperative/procedural one.


SQL is immensely successful.


Yea, so is HTML, but if someone's list of useful programming languages went C#, HTML, Ruby, SQL, you would have to admit that you would consider that person very inexperienced. It's like listing a blender in a list of your favorite power tools. It's not that it's not useful for what it does, it just doesn't belong in that particular comparison.


SQL stands for standardized query language.


It was originally SEQUEL (Structured English Query Language) and then changed to SQL (Structured Query Language) due to acronym conflict with Hawker Siddeley's trademark.

https://en.wikipedia.org/wiki/SQL#History


>> Also, python wants to be everything ( functional, OO, imperative, etc ) and hence is the worst language to learn programming language topics in my humble opinion.

I think it doesn't get the functional/imperative parts right, as they seem to be patched onto it. It is just a scriptable, OO language and pretty good at it, when combined with its batteries-included standard library et al.

OTOH, Common Lisp is everything and is great at it. IMHO, a language with only one paradigm is very restrictive and rarely helps when solving real-world problems, as they can't be effectively modeled in one paradigm only.

That is why CL is just a joy to solve problems with. So is Python and anything else if that one major paradigm that it supports is a right fit for the problem at hand.


What sort of imperative flow do you find awkward in Python?


Some people have good taste for technologies and others do not.


Lots of money? Lots of unpaid volunteers! Compare with Java, Go, JS, Swift, even Rust!


how do you define "field of computing"?

It's taught in most beginning CS courses and has wide adoption for scientific computing and business. It seems like most apps are prototyped in R/Python before functions get ported to C or parallelized with Dask/Spark/Scala.

Python was never intended to be a functional language, but it has lambdas. Ruby and C++ are both "functional, OO, imperative" in the same way.

C with classes and first class functions is probably the best language


Yep, truly a Jack of All Trades, Master of None. And this is fine.


This is likely a retread of the old saying that the WordStar commands were everyone's second favorite key bindings - http://www.mischel.com/diary/2005/08/22.htm .

For that matter, I remember an episode of Magnum PI where the person who hired Magnum had talked to a few other PIs, and hired Magnum because he was always second on their lists of good PIs.


Read the comments here. Look at the diversity of opinions and the number of them that, while professing love for Python, directly contradict each other. Python is verbose|concise, simple|advanced, disciplined|experimental, modern|classical.

It goes to show: something doesn't need to be actually-better. It doesn't need to be actually-simpler. It doesn't need to make your job easier or better. It needs to make people think it does that.

Python is not, in any way, revolutionary. Several contemporaries exist with similar design constraints. But Python persists because it's created a community that considers itself hygienic for avoiding any other aspect of programming. And by doing this, it can be superlative in any aspect it wants to be. It simply conjures the worst example from any given competitor and those are, by proxy, pushed onto all of them.

Did we see a glowing review of how Javascript did this? No. Javascript, despite being the most deployed programming environment family in human history, didn't do this. Python did. Because it says so, over and over.

Even if you disagree with that assessment, I think Python itself has had a lot less to do with pushing Python to new audiences than its community has. The community has projected it outward with an impressive degree of vigor over the years, forcing it on more and more people.


Not all of those are contradictory. For example, Python mostly stays away from cryptic syntax and implicit behavior, making it verbose, but it finds powerful ways to express common patterns, making it concise. You get a small amount of readable explicit code. The words are antonyms if you pick their meanings carefully, but they don't have to be.

Python's design isn't revolutionary, but it's good and the language happened to become popular. So it's not just well-designed, but it also has all the libraries and resources and network effects that you don't get just from having a well-designed language. I'm sure a lot of it was dumb luck, but I like the result.

JavaScript's design had much less impact on its success than Python's design did on Python's success. It was doing the right thing in the right place at the right time, and how it did that wasn't all that important - once it turned into a web standard it wasn't going away.


> Not all of those are contradictory. For example, Python mostly stays away from cryptic syntax and implicit behavior, making it verbose, but it finds powerful ways to express common patterns, making it concise.

Unsurprisingly, I disagree with this opinion. Python code is typically nearly as much of a slog as Go code, but adopts all the brutally hard parts of Common Lisp's functional world by only exposing wrap-primitives.

> JavaScript's design had much less impact on its success than Python's design did on Python's success. It was doing the right thing in the right place at the right time, and how it did that wasn't all that important - once it turned into a web standard it wasn't going away.

The Python community would have us believe this. I think it's a profoundly biased and one-sided reading of the history that paints a bullseye around the target. Python's primary innovation is a community that doesn't embrace differences and growth, but rather hates and hazes it. And when change finally does happen, it happens in the way most likely to cause everyone heartache, as if to fulfill the prophecy the community made.


> And when change finally does happen, it happens in the way most likely to cause everyone heartache, as if to fulfill the prophecy the community made.

I assume you're referring to the 2 to 3 split? Which happened once, and every other change has left code in a completely runnable state. Unless you're talking about the C API, in which case, yeah, it's a PITA to keep it current across versions.

Just today I learned about f-strings and previously not knowing about them being added to the language caused me zero heartache -- I could accomplish the exact same things using the other string formatting forms since they still work as designed.

Aside from the 2 to 3 thing I'd be interested in a concrete example of a change which caused everyone heartache because, honestly, I've never seen it and I (halfass) follow python development.


I was referring to that specifically and it's awful.


Success and popularity of a programming language is mostly determined by the platforms it supports and the libraries it has available.

Objective-C and now Swift are popular because you need to use one of them (unless you go to great lengths to avoid it) to write iOS apps. Javascript is popular because you need it to write web apps.

C and C++ are popular because they run almost everywhere, and are the implementation languages of many operating systems and software platforms.

You need Java to develop Android, and it has an incredible amount of libraries for server side web application development.

In a similar vein, TensorFlow and NumPy/SciPy make Python the de facto language for machine learning, "data science" and scientific programming. It also has an amazing amount of libraries for web application development, text processing, and many other common programming tasks.

Libraries and community/popularity create a positive feedback loop. Large communities create more high quality libraries, which draw in more developers, which create more libraries, etc.

All of this is to say, I don't think much of your observations about Python's design and technical qualities and evolution have much bearing on Python's success one way or another, or for any other programming language. By design or accident, Guido managed to create a community that wrote a lot of useful libraries for many different kinds of tasks, that led to the positive feedback loop leading to Python's current popularity. "Forcing" is not an appropriate term for describing this phenomenon, I think, more accurate to say they have attracted a large community by making Python useful for many different tasks.


> It goes to show: something doesn't need to be actually-better. It doesn't need to be actually-simpler. It doesn't need to make your job easier or better. It needs to make people think it does that.

I don't agree with that takeaway. You ask us to consider all the diversity of opinions here, but then you evaluate them out of context, what one person says about Python is not necessarily in reference to the metrics that another has laid out, hence, they don't "directly contradict each other".

To claim that "oh you all just think Python is less verbose than Java, but it's all hype and no fact" is utter absurdity. You seem to have had bad experiences that make you think Pythonistas are the tech world's Jehovah's Witnesses, when plenty of other programmers will make the same case against Javascript (especially in the Node.js context).

Why do you think Python, or any programming language, has to be "revolutionary" -- or engage in cultmindthink -- for it to become popular?

edit: added "or engage in cultmindthink"


> Why do you think Python, or any programming language, has to be "revolutionary" -- or engage in cultmindthink -- for it to become popular?

I don't think it has to, and I didn't say you HAVE to. I'm saying that you can accomplish the goal with a community that prides itself on ignoring things. Python's community is even more insular and ensconced in their opinion than editor proponents. Most Vi proponents can tell you about how they know Emacs and hate it (and vice versa). In the Python world, not knowing these things is in fact often a purity test, with GVR doing some kind of classic, "I'm just a hyperchicken lawyer" gag reel in public while in private engaging in complex debates about the very concepts he claims are inherently not understandable.


I guess I don't understand what your point is, and I understand even less how you measure it, as you don't even offer up a particular language feature or PEP that Python's purported ignorant groupthink has influenced, or even an outlandish personal anecdote ("This one guy tried to convert all our codebases into Python...").

I guess I can't personally relate since I came from Python after C/JS/Ruby/PHP and have done so not out of delusion or hype. Maybe your perception is based on having met a lot of non-erudite Python developers. Which wouldn't directly contradict the basis of the submitted article, which is that Python is an incredibly popular language, i.e. one that attracts the masses.


Isn't it amazing how an insular group managed to independently come up with Haskell-like list comprehensions, and even give them the same name?

And how they managed to develop a unit-test framework for the standard library which coincidentally happened to be similar to the xUnit architecture?

It's a miracle that a group so dedicated to ignoring things even heard about the 'await' and 'async' keywords for asynchronous programming - https://www.python.org/dev/peps/pep-0492/#why-async-and-awai... - much less implemented them.


> Isn't it amazing how an insular group managed to independently come up with Haskell-like list comprehensions, and even give them the same name?

That's the big irony: the very same things the senior community and GVR pan are in fact their bread and butter. They're smart and informed. But they engender an attitude of willful ignorance in their community.

GVR is smart. I've been on a pretty intense private mailing list with him. He can competently talk to both advanced functional programming and OO programming. He can talk about advanced compiler and gc subjects. He just doesn't in public, because that'd weaken the message and persona he adopts in public.


To the contrary, I cannot reconcile your statement about their insularity with the observed behavior.

It's almost as if you are wrong in your interpretation.

It's easy to prove me wrong - where has van Rossum panned list comprehensions or the new async/await features?


> It's easy to prove me wrong - where has van Rossum panned list comprehensions or the new async/await features?

Since I never claimed he did, I struggle to provide offhand examples.

See: every discussion about lambdas, discredited every day by the existence of a robust self-taught JavaScript community.


Are you being deliberately obtuse?

You said Python developers as a whole were insular and as a whole it "prides itself on ignoring things".

I listed three different things which Python obviously got by looking to other languages. This is neither insular nor ignorant.

How do you reconcile your statement with my observation?

You replied with "The very same things the senior community and GVR pan are in fact their bread and butter."

I presumed this had something to do with the three items I listed, since that's how conversations are supposed to work.

Van Rossum has been pushing for async programming since at least Tulip, so it was certainly part of his 'bread&butter'. Thus, async/await are in the category of things you claimed that he panned.

Now you say you didn't claim that?

Are you really just so pissed off that van Rossum doesn't like lambda that you want to besmirch the entire Python development community?


I believe SQL is "the most deployed programming environment family in human history". SQLite is seemingly everywhere.

Your second paragraph does not follow from the first. It could instead be that HN commenters are poor at this sort of analysis, or that people in general are poor at this sort of analysis.

What were the contemporaries with similar design constraints? I started looking at Python in 1995, and the major alternatives were Perl and Tcl. I don't remember anything closer to Python than those.

I agree that Python is not revolutionary. I don't agree with your statement "avoiding any other aspect of programming". Van Rossum's ABC experience strongly influenced his ideas in how programming should be accessible to non-professional programmers, and that type of design is certainly an aspect of programming, though often overlooked.

I really don't understand "forcing it on more and more people." In some sense it's true - Python is a widely used teaching language, and as such the teachers are forcing ever more students to learn it; Python is the de facto required language for a number of fields, so again people are 'forced' to use it because their tools are often in Python.

But I don't think that's the type of force you mean?


> I believe SQL is "the most deployed programming environment family in human history". SQLite is seemingly everywhere.

How many computers have a browser on them vs an interactive and instructable version of an SQL server?


You'll need to define some terms here. What is a "programming environment family"? Does BASIC in its various forms count? That includes Visual Basic and VBA.

Do people need to use it, or could it just be sitting there, like BASICA.EXE on an old MS-DOS distribution?

For that matter, the DOS shell is still hanging on, and it's a programming environment.

Most people use browsers on a smart phone or pad, with (as I understand it) no access to a programming environment.

SQLite is not a server.


I think that BASIC might actually be the dark horse I haven't considered here! Nice! I wonder if the fact that it isn't shipped on the other platforms evens that out though.

I just pulled out my surface book and Basic doesn't seem to be on my command line. I wonder if that's unique.

> SQLite is not a server.

Did I imply it was? SQLite is not a programming environment when shipped, either. An SQL server with a sufficiently large editor buffer might be able to squeak by as one.


Most of those web browsers embed SQLite:

https://www.quora.com/Which-browsers-support-SQLite-How-univ...

Which I think is available interactively through Javascript? (Too lazy to research further right now.)


>>I believe SQL is "the most deployed programming environment family in human history".

Actually that is Microsoft Excel.

Neither SQL nor Python will ever come close to the ubiquity Microsoft Excel has.


I take the exact opposite view from yours: when something is aesthetically pleasing, people will come up with rationalizations for why they like it, even if the real reason is difficult to describe. Similar to how in a focus group you can judge people's overall opinion but you should not focus on their specific feedback.


I am involved in many, many language communities. I've shipped software to production audiences of varying sizes in C, C++, Perl5, Common Lisp (Allegro), Common Lisp (SBCL), Ruby, Python, Scala, Ocaml, Java, JavaScript, Clojure, Haskell, F#, Typescript, Purescript, and Pony (finally!). I've used many more languages for non-trivial projects, including some now-defunct but really beautiful contenders like Nim, Io, Pike, Perl 6, Factor, Elisp, Vimscript, Coq and Idris. That's off the top of my head, I'm forgetting at least half a dozen more.

Python's community certainly feels distinct to me in this way.

But this is, of course, subjective. Your mileage may vary.


To avoid confusion: Perl 6 is not (anymore) on the defunct list, it is still on the "beautiful contenders" list.


Uh? Is Nim defunct? I wouldn't say so!


Yeah, I miswrote it. I meant to phrase it differently. I love most languages on that list and didn't see that my on-mobile editing had caused me to say Idris is defunct until it was too late. :(


A Javascript array comprehension [0] in theory is similar to a Python list comprehension [1]. However, the slight difference in syntax makes Python dramatically easier to read and write. Javascript dropped the feature. Pythonistas use it often.

Many programmers, even Python core devs, don't understand what makes one syntax more readable than another.

[0] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

[1] https://docs.python.org/3/tutorial/datastructures.html#list-...


"However, the slight difference in syntax makes Python dramatically easier to read and write"

Please prove it.


The evidence is in the usage. As I said, Javascript decided it was a bad idea, despite its popularity in Python.

Your phrasing is a little aggressive, so I'm guessing this is one of those discussions where there's no real chance to change your mind, but I'll give it a shot.

Consider one of the most friendly languages, SQL. Python comprehensions follow the same pattern as SQL.

    SELECT expression FROM table WHERE condition
    [expression for name in sequence if condition]
Javascript put the expression at the end.

    [for (name of iterable) if (condition) expression]
Putting the expression first makes the language more declarative. The easiest languages -- SQL, HTML, etc. -- are declarative, not procedural or functional.


> The evidence is in the usage...

It's possible it was not adopted because there's a cost involved in every feature that has inconsistent support across runtimes. The benefit needs to outweigh that cost.

> Putting the expression first makes the language more declarative

The case for a specific order makes less sense in a declarative language, not more.

And:

[1,2,3,4].map((x) => x * 2)

Doesn't seem substantially more or less declarative than:

[x * 2 for x in range(1, 5)]

to me. If anything, the latter example asks more of the reader because it requires an awareness of case-specific semantics.

And overall, it's simply never been clear to me what comprehensions buy in terms of readability or semantics over map/reduce/etc, and given JS cultural buy-in on some imported concepts from FP, my guess is that the marginal benefit just didn't seem worth the cost rather than it being a misstep in the specific proposed JS syntax.

Also: SQL has its points, but it's not a language I'd generally describe as easiest. Declarative DSLs tend to be largely convenient only in their domain. It may be worth admitting SQL to this discussion, I suppose, given the overlap between its domain and map- or foreach-like concepts, but is it really strictly easier to reason about how to do this particular thing in it (and what it's actually doing) than either Python or JS? I don't think that's as straightforward an argument as you seem to have assumed.


Listen to how people who don't program describe the task of transforming each element of something. In my experience they often say something along the lines of, "I want y for each x in z," rather than, "For each x in z, I want y." The latter is more common among people who've been programming in certain languages, not the average person. I don't have systematic data collection for this, just personal experience. But the Javascript decision adds some evidence, in my view.

> it's simply never been clear to me what comprehensions buy in terms of readability or semantics over map/reduce/etc

Ask some random person off the street to read your source code. They're more likely to get the gist of code using comprehensions than code using map/reduce/etc.

> SQL ... convenient only in [its] domain.

Which is exactly the domain we're talking about when discussing comprehensions.


> Listen to how people who don't program describe the task of transforming each element of something. In my experience they often say something along the lines of, "I want y for each x in z," rather than, "For each x in z, I want y."

If we're really at the point where we're arguing between which clause order (among other potential variations in expression) is more natural across most natural language speakers, that should about wrap it up.

But to the extent that it doesn't, this assertion has the merit that it sounds like an empirically verifiable position.

> Ask some random person off the street to read your source code.

Which is worth considering, I suppose, in situations where the relevant software is being built by people being brought in off the street.

> [SQL] is exactly the domain we're talking about when discussing comprehensions.

"Exactly" overstates the case. As I said above, I recognize it's a related domain, but there's plenty of things that are easier to do with Python or Javascript than they are in SQL, even specifically with operations over every tuple in a set.

And again, declarative == easier is an overgeneral assumption. Take Prolog, for example. Definitely domain-overlapped with SQL, often a poster child for declarative. I quite like it myself, but it's not always a hit at parties. Even hackathon parties.


> empirically verifiable

I'd love to see the data.

> software ... built by people ... off the street

Programming is such a rapidly growing field that I think the majority of software will be written by programmers with less than 1 year of experience. For a time, anyway. Maybe in 20 or 30 years the growth will slow.


> The evidence is in the usage.

That's not what I asked for. You're asking me to accept a concrete proposition that I obviously don't. To sway me, I'd like proof, not a single example labeled evidence.


Proof? Describe what proof means to you.

The fact that Javascript decided not to adopt the syntax seems like good evidence to me.

I'm not asking you to do anything. I'm offering you some insights. Normally people pay me for my time, but I suppose I feel like volunteering at the moment.


The fact that python decided to use a different syntax doesn't prove that it was good either.

What you volunteered to me is a mirror image of an argument you think I presented, then suggested my version was wrong.

But I'm not actually advocating for JavaScript syntax here. I just don't think list comprehension syntax, as presented by python, is particularly better in comparison. You said it's much better, then demanded that I agree with you.


Demanded? I ain't demanding shit. Certainly not of a stranger on the internet.


> Python is not, in any way, revolutionary. Several contemporaries exist with similar design constraints.

I think it was revolutionary in its own terms. For example, list slicing was revolutionary and it reduced many lookups to the minimum. Sure, it was implemented by other (not so popular) programming languages before, but not in combination with many other features like easy interpreter embedding, good reflection/metaprogramming, straightforward bindings, Unicode, etc.

Now in 2018 all these things are debatable and we can objectively criticize long-standing issues (e.g. the interpreter lock), but Python showed that you can write applications in a concise way without a lot of boilerplate. I would love to see a programming language that progresses from this point.


> I think it was revolutionary in its own terms.

I think this is precisely what I said. If you discount any other competitor and what they did, Python was revolutionary within itself. Which is not all bad, it means the language has had progress. But it's a very narrow metric.


For best results, read in the voice of the Architect.

More seriously, Python borrowed list comprehensions and the like from other programming languages and Javascript said "Hey, that's a good idea.".


> No. Javascript, despite being the most deployed programming environment family in human history, didn't do this.

It's definitely not.


What programming environment is more deployed?

It's surely the case that other binary interpreters are more deployed. But which of them ships with free input, a runtime, a debugger and (for nearly every implementation) an interactive visual editor?


Office Macros (and the underlying Windows COM interfaces), which include all of those features if you enable it. Bash.

Desktop versions of browsers have all those things you listed. Mobile browsers do not. Also, only modern-ish desktop browsers have that feature set; for a very long time the predominant browser of the day hardly had any debugging experience worth mentioning, while the two I've listed have had them for a long, long time.


> Office Macros (and the underlying Windows COM interfaces), which include all of those features if you enable it. Bash.

Bash is a good one! But I think windows ships without it. Office Macros are a really good one. Excel is also really good. But I think Windows technically ships without both, whereas the current browser is technically a full on programming environment and has been there since (minimally) IE7, which was a long, long time ago.

Sorry, I did kinda cheat on the way I defined my statement tho.


Also, to be fair, I omitted that Windows ships with JScript automation, which I believe has some esoteric debugger and GUI you can access by default.

And yeah... It's JavaScript! Kind of.


I am sorry but this does not compute: how is it "forced" on audiences?


I remember hating the whitespace thing so much when I started with Python circa 2.4, but have really come to love the language in a lot of domains since then.

A huge, huge "thank you" to Mr. van Rossum for creating this quirky language that I've become so fond of and for the countless hours of his life he's given to the project.


from __future__ import braces

I hated the white-space thing until I discovered that. For whatever reason this made me get over myself and learn to love the tab/space argument that is somewhat akin to the vi/emacs argument.
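(For anyone who hasn't tried it, it's an easter egg rather than a real feature; you get something like:)

    >>> from __future__ import braces
      File "<stdin>", line 1
    SyntaxError: not a chance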


I know that now. 2.4 was a long time ago... there is some serious wisdom behind that decision/PEP-8/"import this" that took me a while to appreciate. Human beings are the weak link in developing software, and language decisions that nudge towards highly readable code are productivity decisions.


Oh, I see. I use spaces and vim (I do, but who cares). Let's start a flamewar. /s

Every time I think about tabs/spaces, I think about that episode in Silicon Valley [1]. Hilariously accurate, I have the exact same feeling when somebody uses the mouse to copy something or navigate. The cringe makes me feel alive (I guess this is the ultimate reason for those flamewars. And neurotic tendencies, of course).

[1]: https://youtu.be/SsoOG6ZeyUI


  :set ts=8 sts=0 sw=8 expandtab
I have secrets.


Go one step further and add these to your vimrc:

    command Tab1 set ts=1 sts=1 sw=1
    command Tab2 set ts=2 sts=2 sw=2
    command Tab4 set ts=4 sts=4 sw=4
    command Tab8 set ts=8 sts=8 sw=8


slight addition:

    command! Tab1 set ts=1 sts=1 sw=1
    command! Tab2 set ts=2 sts=2 sw=2
    command! Tab4 set ts=4 sts=4 sw=4
    command! Tab8 set ts=8 sts=8 sw=8
The ! exclamation mark is there to prevent vim from complaining about a duplicate command already existing whenever you reload your vimrc.


Thank you for this. I've been typing out:

    :set tabstop=4
    :set shiftwidth=4
    :set expandtab

Like a pleb for years. What's sts?


:help 'sts

Number of spaces that a <Tab> counts for while performing editing operations, like inserting a <Tab> or using <BS>. It "feels" like <Tab>s are being inserted, while in fact a mix of spaces and <Tab>s is used. This is useful to keep the 'ts' setting at its standard value of 8, while being able to edit like it is set to 'sts'. However, commands like "x" still work on the actual characters.

When 'sts' is zero, this feature is off.

You can also abbreviate expandtab to et.


Put them in your .vimrc, and put your dotfiles in git.

Hint: searching GitHub for "dotfiles" can be a fun and productive way to spend a Friday afternoon :-)


I mean locally yes, but I'm poking around on servers frequently enough and writing python there (Hey, I'm no saint) that it gets committed to memory eventually.


Haha. When we watched that episode my wife was like, “...wait, what, is that a real thing?!” Only true computer geeks could possibly understand.


Whenever I see that clip, what I'm left wondering is whether there really are people that press the space bar eight times. I have no recent memory of any editor that didn't let me press the tab key and have spaces appear instead.


Yeah, that was annoying. If she had configured her editor to insert 4 spaces (or 8, but I think that's too many) when she presses the tab key, it would be more realistic and they could still be together (until he reads through her code). [sheds a tear]


I'd never seen that before!

She's obviously the smarter one: Vim and (4) Spaces ftw.


You monster. Not a chance.


Same here. I remember that on one of my first attempts at trying out Python, coming from Ruby, seeing that indentation error made me immediately give up. Hard to give up the freewheeling nature of Ruby, but I am very glad that I did. "Use whatever style makes you happy" has extremely diminishing returns the more you code in a language.


I've grown to accept it. However, sometimes I wish we had something to replace braces, like IDE coloring schemes that clearly represent indentation levels. In larger programs it sometimes gets difficult to tell the levels apart.


There seems to be a sentiment that Python is a kids language - but I think of Python as a language that scales with your experience. You can do some really interesting and complex things and I’m really appreciative of my deep understanding of the language through a decade of heavy use, open source contribution, and project ownership.

Recently I released a package which allows you to dynamically create or change function signatures and went deep into `typing` and `inspect`, and found the experience rewarding.

https://pypi.org/project/python-forge/
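(Not forge's actual API, just a rough sketch of the stdlib machinery it builds on: you can hand a function a synthetic inspect.Signature and introspection tools will report it.)

    import inspect

    def f(*args, **kwargs):
        pass

    # hypothetical example: make f advertise the signature (x, y=0)
    params = [
        inspect.Parameter("x", inspect.Parameter.POSITIONAL_OR_KEYWORD),
        inspect.Parameter("y", inspect.Parameter.POSITIONAL_OR_KEYWORD, default=0),
    ]
    f.__signature__ = inspect.Signature(params)

    print(inspect.signature(f))    # prints "(x, y=0)"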


> There seems to be a sentiment that Python is a kids language

I only hear this from developers that have never used Python or work in problem domains where the language is genuinely not better (e.g. low-latency distributed services, embedded devices/IoT sensors).

I've scaled Python & PHP web apps to a hundred million monthly visitors on Top 20 US destinations on very limited hardware (6-10 machines). It's surprising how much you can get out of the language if you know how to take advantage of the strengths.


Even IoT sensors are easier to program with micropython.

What I find worrying is that the language is changing to suit those people, e.g. type hints, rather than teaching them to use the language. You don't complain that you can't use a hammer as a screwdriver; you just don't use a hammer.


I agree that sensors are easier with Micropython and will use that if I can (unfortunately the C++/Arduino ecosystem is much, much larger). However, type hints are a fantastic feature and I can't recommend them enough to anyone who writes software for a living. It makes maintenance at least an order of magnitude easier.


1). C++/Arduino for microcontrollers ends up using more memory than micropython. I've managed to get the binary sizes to 1/4 by doing cross compiles and linking things myself.

2). Type hints are a terrible non-Pythonic kludge to solve a people problem in code. People who want to write Java with whitespace and no curly brackets love them. CPython has made a number of bad decisions lately that make the language less usable. I've moved quite a bit of my everyday code to PyPy. Since they eat their own dog food, they are keeping their dialect sane.


Type hints are optional, why not just not use them?

How are you going to know the type of an object that gets passed to a function? Why is it better to go around the codebase looking for things instead of glancing at the function signature?
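e.g. a hinted signature like this (made-up example) tells you at a glance what goes in and what comes out:

    from typing import Dict, List

    def count_words(lines: List[str]) -> Dict[str, int]:
        # the annotations document the contract without changing runtime behaviour
        counts: Dict[str, int] = {}
        for line in lines:
            for word in line.split():
                counts[word] = counts.get(word, 0) + 1
        return counts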


The type (class really) of an object tells you nothing about how it will behave. What you need to do is check, at runtime, that the methods attached to the object, not the class:

1) Exist.

2) Behave how you want them to behave.

I've written plenty of code that got that done by using an assert just after the docstring.
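Something like this is what I mean (a sketch of the assert-after-the-docstring style, with made-up names):

    def send_all(records, sink):
        """Write each record to sink."""
        # duck-typing guard: care about the capability, not the class
        assert callable(getattr(sink, "write", None)), "sink needs a write() method"
        for record in records:
            sink.write(record)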

People will monkey patch, if not in your code base, then in a library you're importing.


> Types hints are a terrible non-pythonic kludge to solve a people problem in code.

I have a love/hate relationship with type hinting in Python. I've been using them for about 2 years, and it still just feels... unnatural. I usually add enough so that my IDE (PyCharm) has better autocomplete and that any Sphinx generated docs have sufficient coverage of types I may have missed when adding @params.


I have mostly a hate relationship with them, because the people who use them do not write python. They write c# or java in python.

And you shouldn't need an IDE with a scripting language. That one of the main selling points for type hints has been "Your IDE will do things for you!" is a warning that this is a bad move. Again, Python is not Java, and it shouldn't be trying to be Java.


English is a kids language... sure you can use it for your PhD thesis and the rest of your career, but there's no disputing that a lot of kids know it too; parts anyway.


>>There seems to be a sentiment that Python is a kids language

Python is like the new Basic.

Very easy to work with, so a lot of people would be using it. Like volumes of people.


Python also gets used for some serious stuff in a way that Basic never did. CERN uses it to control their systems, 'Try Ubuntu' uses it to install a second operating system, YouTube uses it to serve videos, and so on.


For some reason, I never liked Python. I don't know... its classes look weird with all those self, __init__, etc. The __init__.py for packages is also a strange choice to me. It has classes, but no interfaces. The language is even slower than PHP. Does Python have type hinting? What is all the hype about? I just don't get it.


Agree. I can't understand people praising the lightness and readability while at the same time stuffing their code with underscores. Also it often seems inconsistent. Like some functions are used as a prefix (len(x)) while others are a suffix (x.clear()), I assume for historical reasons. Or like the "b if a else c", which reverses the logical order (you evaluate the condition, then you can tell whether you go one way or another). Same for [x for x in list]; the Linq approach is more logical (for x in list select x).

I understand why people like it. And I like scripting languages; it's nice to find a piece of script somewhere and be able to edit it directly without having to find the correct source code and recompile. But it has lots of shortfalls and quirks too (multi-threading, dependency management, deployment, no refactoring/static checking, Python 2 vs 3, 50 different kinds of strings, APIs full of abbreviations which probably meant something to their creators 20 years ago, etc).

It's a nice language, among others. Not an "ultimate" language by any means in my opinion.


Those are all valid gripes, but for most use-cases (where performance isn't critical), I haven't found anything better than Python yet. Ruby is an aesthetically nicer language and doesn't have those issues you mentioned in the first paragraph, but the Python ecosystem is amazing and better than Ruby's IMO. And compared to most other languages Python still feels very clean, terse, and pretty to me.

>multi-threading

Most interpreted languages have a GIL, sadly. But yes, this is annoying.

>dependencies management

Annoying, but resolved in part with new projects like pipenv, and also still a big issue for other popular languages.

>deployment

Also helped by pipenv, and Facebook's xar (https://github.com/facebookincubator/xar) is new but looks promising.

>no refactoring/static checking

It's a dynamic language. But this is now possible to some extent with type hints.

>python 2 vs 3

They did screw this up, yeah. Python 3 fixes a ton of issues and is a better language, but the backwards incompatibility created way too many problems.

>50 different kinds of strings

There are 2 kinds: bytes and unicode.

>APIs full of abbreviations which probably meant something to their creators 20 years ago

Can you give some examples? This is an issue in a few of the old standard libraries, but even then it's not too common. This is way, way more prevalent in PHP and C than in Python.


> There are 2 kinds: bytes and unicode.

No I mean single quote, double quote, triple double quotes, r prefix, u prefix, f prefix, combinations, etc

> Can you give some examples

The one I have in mind is the os library, which I use a lot (like why st_mtime?).


>No I mean single quote, double quote, triple double quotes, r prefix, u prefix, f prefix, combinations, etc

These are all extremely helpful, though. All of those quote combinations are equivalent, and can be used to prevent having to escape quotes within the string. The u prefix is gone from Python 3, r is a raw string (which many other languages have, like C#'s @ prefix), and f is a new feature which generated a lot of debate but basically allows interpolation similar to Ruby and ES6.
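A quick, rough tour of the common ones:

    >>> 'spam' == "spam"                  # single vs double quotes: same string type
    True
    >>> r"C:\new\table"                   # raw: backslashes kept literally
    'C:\\new\\table'
    >>> name = "world"
    >>> f"hello {name}"                   # f-string (3.6+): interpolation
    'hello world'
    >>> """triple quotes
    ... span lines"""
    'triple quotes\nspan lines'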

>The one I have in mind is the os library, which I use a lot (like why st_mtime?).

That's the one I thought you were going to bring up. I agree the naming conventions in the os module have a lot of warts. The function names could use a refresh.


You wrote "These are all extremely helpful".

I disagree, in that I think they are only somewhat helpful. If we only had double-quoted strings, and not single-quoted strings, then I think there would be little difference to the language or its quality of use.

Going back to the question of "why st_mtime?" - that's because os.stat() is a wrapper around the POSIX stat(2) system call. My FreeBSD man page defines it as:

   st_mtim   Time when file data last modified.  Changed by the mkdir(2),
             mkfifo(2), mknod(2), utimes(2), write(2) and writev(2)
             system calls.
The Python implementation of stat for MS Windows is basically doing its own emulation layer for POSIX compatibility.
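e.g. (roughly):

    import datetime
    import os

    st = os.stat("tmp.dat")                                # thin wrapper over stat(2)
    print(datetime.datetime.fromtimestamp(st.st_mtime))    # last modification time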


Yes, Python has type hinting since 3.5[1].

https://www.python.org/dev/peps/pep-0484/


To be clear, Python has had type annotations for a while now, but the default CPython interpreter doesn't use that information to optimise performance.


The CPython interpreter does very little to optimise performance because, as the reference implementation, they strive for simplicity of implementation.


No interfaces?

    from collections.abc import Sequence
Slower?

    import numpy as np


You're referring to duck typing, which to me contradicts the Zen of Python: Explicit is better than implicit.

> Slower?

Yes, overall the language is slower than PHP. It's not a bad thing, just the result of benchmarks. It doesn't mean one should stop writing Python.


No, I'm not referring to duck typing. An abstract base class enforces the interface. Explicitly.
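A toy sketch of what I mean (Serializer is a made-up name):

    from abc import ABC, abstractmethod
    import json

    class Serializer(ABC):
        @abstractmethod
        def dumps(self, obj) -> str: ...

    class JsonSerializer(Serializer):
        def dumps(self, obj) -> str:
            return json.dumps(obj)

    # Serializer() raises TypeError, as does any subclass that
    # forgets to implement dumps() -- the contract is enforced.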

Some benchmarks might be slower, but I don't know anyone doing high-performance computing with PHP. I do see the topic at Python conferences.


abstract base class != interface


Let me help you two out:

ABCs are like interfaces in that they can be used to enforce properties (like the presence of particular functions) of a class that extends/implements them; BUT interfaces can also be used to enforce that a particular variable always refers to something with certain properties (without worrying about what it is concretely).


Right, they're even better than Interfaces: both a contract and a partial implementation. No need for an "Interface" when you've got ABCs.


I see what you mean, but to me abstract classes are distinct from interfaces. An abstract class answers the question of what an object is, whereas an interface is about what the object can do.


The distinction between what an object is and what an object can do is an artifact of bad object modelling (specifically, single inheritance OOP.) Any “can do” set has an exactly corresponding “is a thing that can do” set, so there is no logical need for different constructs to model “is” sets and “can do” sets.


Sounds like a controversial statement and I'm not sure how universally true this is. I'll give it a thought. Thank you.


Another way of explaining this is that Interfaces are Java's way to compensate for the lack of multiple inheritance.

There's more than one variety of inheritance. Both Python and Java chose implementation-inheritance, meaning that the children get to use not only the interface, but also the implementation of the parent.

For some reason, I'm not quite sure why, the Java folks thought that multiple implementation-inheritance would cause too many problems. They still recognized it's useful, so they provided a 2nd form of inheritance that supports multiple parents, interface-inheritance.

There are more varieties of inheritance that a language could choose, for example, prototypes like JavaScript.


Yes, I'm familiar with all of that, but isn't the Diamond of Death [1] the exact reason why they did it?

[1] https://en.wikipedia.org/wiki/Multiple_inheritance#The_diamo...


People read left to right. That solves all the trivial problems. The harder problems were always vocabulary problems not programming problems.

The trouble is that English, and other human languages, have never been as clear or specific as we assume they are when we start writing code. For example, what is a bird? It's a thing that flies? Oh, ostrich, penguin...

All these supposed OOP inheritance problems aren't unique to inheritance but to the translation of human language to programming. There's no need to exclude multiple inheritance, because nearly every aspect of programming is afflicted.
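In Python terms, that left-to-right reading is the C3 method resolution order; a tiny sketch of the classic diamond (class names are made up):

    class A:
        def who(self): return "A"

    class B(A):
        def who(self): return "B"

    class C(A):
        def who(self): return "C"

    class D(B, C):
        pass

    D().who()                               # 'B' -- bases are searched left to right
    [cls.__name__ for cls in D.__mro__]     # ['D', 'B', 'C', 'A', 'object']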


That distinction is unnecessary.


What other language are you comparing Python's flaws and tradeoffs to?


Java, PHP, C#, heck, even JavaScript/ES[6,7]. I feel like Python has a lot of its own ways of doing things, whereas a more idiomatic approach is imho a better way to go. Why in the world would you name a class constructor __init__? Pretty much every popular OOP language does it either by giving the constructor the same name as the class or through some form of a `construct` notation. Why `self` and not `this`? I mean, I know that the language has a very low entry barrier, but wouldn't it make sense to design it in a more traditional way?

    if __name__ == '__main__':
        main()
Code like that just doesn't make any sense to me.


Java, C# and modern PHP have a very similar style that's not at all universal. If you're used to those, Python is odd, but it can get a lot weirder. At least Python has classes.

__init__ is called that because it initializes the object. It could have been called "__construct__", but I think "__init__" is a lot clearer about what it actually does - the object is already created when it enters __init__, __init__ just prepares it.

Calling it "self" instead of "this" is super traditional. Smalltalk did it.

The __name__ == '__main__' pattern is ugly, and I'm not a huge fan of it. You don't always need it, it's just a guard to let you tell whether a script is run directly or imported as a module. The key is that Python only has runtime. A function declaration or class declaration is a fancy variable assignment. The content of the file is executed sequentially.

Python is not especially weird, but it is different from what you're used to. Wrapping your head around it takes a while, like with all languages, but things start to click eventually.

It helps to experiment. The REPL is great for that. If you don't understand why something is the way it is, mess with it until you think you understand its behavior, then mess with it some more.
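A minimal sketch of that guard, assuming a hypothetical module called greet.py:

    # greet.py
    def main():
        print("running as a script")

    print("this runs on import too")    # module code executes top to bottom

    if __name__ == '__main__':
        main()    # only runs via `python greet.py`, not via `import greet`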


> whereas a more idiomatic approach is imho a better way to go.

I don't know what you mean by "idiomatic approach". That phrase, I thought, refers to languages that are highly opinionated about using a specific style -- and that philosophy is spelled out in the Zen of Python:

https://www.python.org/dev/peps/pep-0020/

There should be one-- and preferably only one --obvious way to do it.

> Pretty much every popular OOP language does it either by naming a constructor the same name as the class name or through some form of a `construct` notation

Seems kind of arbitrary to complain about the constructor keyword `__init__` when the constructor word for pre-ES6 JS, C, Perl, Ruby varies between prototype, constructor, __construct(), and new. Not sure how you say Python's use of `__init__` is somehow worse. As for why `self`, no idea, but seems to be a pretty easy convention to figure out.

As for the complaint about the `if __name__ == '__main__':`, which language are you referring to that has a more self-evident way of encapsulating that functionality?


I'm not talking about highly opinionated languages, but common idioms across OOP languages.

C doesn't have classes. pre-ES6 - agreed, but who writes it? I think Perl is a bad example, it's dying and is one of those highly opinionated guys, with all its `method`, `trusts`, etc.

> As for the complaint about the `if __name__ == '__main__':`, which language are you referring to that has a more self-evident way of encapsulating that functionality?

My gut tells me that this is an error-prone idiom.

I'm not saying Python is bad, I just don't like it. I'd love to know what the strongest sides of the language are, though.


For me it's somewhat strange that it is universally presented as a beginner's language. Python would be hugely confusing to me if I didn't keep the underlying C model -- dynamically allocated objects with reference counts -- in mind at all time. Together with a lot of protocol that is largely arbitrary.


I've met a lot of people who have programmed simple things in python without understanding the memory model at all so for most people it doesn't seem to be a large impediment.


Would you also find Excel macros hugely confusing if you didn't know the underlying C model? Because I think that's the level of programming that the masses are doing.


To me that would only be relevant if I'm doing super performance sensitive stuff, and if that's the case I wouldn't be using Python.

I've been programming in both C and Python for years and never once have I had to consider reference counting in Python.


When teaching Python to beginners, I struggle with the concept of mutability -- e.g. `sorted(mylist) vs mylist.sort()` without being able to refer to the concepts of memory and references. But it's not a terrible obstacle if you drill in extra adherence to logic and testing and jumping into REPL. After enough trial-and-error with `x is y` `x == y` and `id(x)`, the abstract concept comes through without having to acknowledge the existence of reference count.
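For anyone following along, the difference in a few REPL lines:

    >>> mylist = [3, 1, 2]
    >>> sorted(mylist)      # returns a new list, the original is untouched
    [1, 2, 3]
    >>> mylist
    [3, 1, 2]
    >>> mylist.sort()       # sorts in place and returns None
    >>> mylist
    [1, 2, 3]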


Hardly a beginner at programming, but I also struggled with this when I picked up Python recently. The documentation doesn't make it obvious when functions have this sort of side effect, so like your students I play with the REPL to establish exactly what a function will do. Even then an unexpected side effect was the source of the single most perplexing bug in my program. I imagine that in complex programs this philosophy of sometimes-mutability-sometimes-not must lie at the root of a huge amount of programming error.


Can you provide an example of why you need to understand the underlying C model?

I don't think I've ever considered it when writing a Python program.


In all this time I didn’t even know python uses reference counting instead of tracing gc...


It also has a cycle-detecting garbage collector.

    a = []
    a.append(a)
Can't collect that one with ref counting.
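A small sketch of the cycle collector picking it up (CPython):

    import gc

    gc.disable()          # leave only reference counting for a moment
    a = []
    a.append(a)           # the list refers to itself
    del a                 # unreachable now, but its refcount never hits zero
    print(gc.collect())   # the cycle collector finds and frees it; prints >= 1
    gc.enable()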


Weird. :)


Am I the only who thinks that "90% of American parents want their kids to be computer scientists" is rather unsettling?

The market is currently full of people who are in it for the money, and it will continue to fill up until supply meets demand. That salaries will drop significantly is a natural result of this.

We had this pork cycle for teachers, doctors and other highly trained people. No one benefits from too many people driving prices down (although as an employer I shouldn't complain; companies are happy about it - the profit margin shrinks, but that can be compensated for by scaling). Those who are in it for the money will leave (or continue to suffer in an unfulfilling job) and those who have passion can't sustain themselves (we can see this trend in indie game development today).


I found the source here: https://news.gallup.com/poll/184637/parents-students-compute...

I agree it is unsettling, and I'm one of those ones who studied computer science for no other reason than the American "follow your dreams" meme. There weren't a lot of kids in my CS department at school, and they definitely weren't the business school follow the money types. We have had quite a few applicants lately who can't program their way out of a wet paper bag, but we have also had quite a few very qualified ones who were rejected simply because there weren't enough open positions.

It's generally a challenging job despite the myths, and programmers seem to be good at making things worse and more complicated for themselves, thus requiring more resources (programmers) to fix things. Also, the field is becoming deep enough that there are all types of specialties. Right now, AI and cryptography specialists can probably command high salaries, while your jack-of-all-trades guys are becoming a dime a dozen.


Maybe it makes sense to encourage your kids to go into blue collar jobs. In a future teeming with code monkeys, plumbers and electricians are going to be highly valuable.

As for the current career programmers, maybe we should unionize before our jobs are being taken by kids coming out of 2 week bootcamps (who have been taking programming classes since kindergarten).


In a future teeming with code monkeys, plumbers and electricians are going to be highly valuable.

As a person who repaired quite a few toilets in their life

As for the current career programmers, maybe we should unionize before our jobs are being taken by kids coming out of 2 week bootcamps (who have been taking programming classes since kindergarten).

I've seen such people(well, they started learning in primary school, not in kindergarten, but nevertheless) and two things stand in the way of this scenario:

1. Barely anybody likes this line of work. I've had people who could really use some extra cash outright refuse to learn our trade because they found it boring. Apparently our unique skill in the job market is an immunity to boredom.

2. There's more to this job than programming. There's also working in teams which, despite the focus of pretty much every western education system on this, is something that people seem to learn only after a good few years of full-time work.

Also people fresh out of college/bootcamps tend to fall for the typical "nerd traps" like trying to build beautiful abstractions where there is no time nor need for that.


> Apparently our unique skill in the job market is an immunity to boredom.

This is about right IMO. Most of the work is seriously boring. I was in meetings talking with software developers about API interfaces of legacy systems (with monologues about the intriguing question: "Why is <insert stupid decision> a thing we have to stick with?" - the answer is always backwards compatibility, btw) and I couldn't understand why they were sitting there and why they thought their lives were well spent doing that. Anything is better - and they earn €3500-4500/month, so it isn't even that much for their 5-10 y/o careers.

Everything is better than this. I love programming, I've started when I was 10-11y/o, but I hate most things in the software industry. I love to program my own little tools, but I've become so fast building CRUD applications and apps that I have 200-300% efficiency (and thus a very high hourly rate) but I'm simultaneously so seriously bored that I have a hard time focussing on work because it's always the same.

- - -

> I've seen such people

Me too: I've seen a kid (maybe 15-16) who has no A-levels and dropped out in the 9th grade to work at a startup. Without any shares. I think he makes €10/hour at most. I was 2 years older than him at the time and worked as a freelancer and had much more experience. Even 2 years earlier I had more skill than him. And I would never have considered dropping out of school back then because it's hilariously naïve. I can't understand how the startup could support it without any moral conflicts. Exploiting 15-year-olds and encouraging them to drop out is something I don't want to be involved in. But I guess this will happen a lot more in the next few years, which is sad.

> Maybe it makes sense to encourage your kids to go into blue collar jobs.

Yeah, don't expect anything like a developer salary. Not even half of it.

My grandfather and my uncle have a blue collar company. They're operating with a 10% profit margin - sounds good, right? Wrong - if one of their employees is sick, they get into money problems really quick. Normally every employee needs to earn the company 2-3x of his salary to compensate for random events. They can't do that because the market doesn't allow them to - other people would do it for less than them and the quality is the same. Supply and demand will eventually lead to diminishing returns in all sectors. And if plumbing is too damn expensive, we have better incentives to automate the whole job.


I believe widespread employment as programmers is the only possible way for the US economy to have a large middle class in the coming decades. Unionized manufacturing ain't coming back.


Sorry to disappoint you, but there won't be a large middle class in industrial nations. It will be gig-based and there will be a big group of people working hard who get nothing and a small group of people who are extremely wealthy and disconnected.

You don't like Fiverr? Imagine that our whole job market runs like this in the future. The profit margin will be extremely small.

You need a website or a logo? Create a Fiverr task. You think you're a better developer or designer than those Fiverr guys? Then wait a few years and you will see that the performance and quality will improve drastically because workers get desperate when the pork cycle hits them.

- - -

I'm not dreaming this up - this is already happening! And even employees are used like replaceable commodities e.g. big agencies have so many applicants that they say to their employees: "if you don't work hard for that shitty pay, you will be replaced immediately" - the agencies are clear about that and demonstrate their power. It's soul-crushing and I've seen great designers who get exploited in those power dynamics.


It's not that I'll be disappointed, but that programming is my only hope.

I'm one of the lottery ticket winner, 1% folks. I'd feel better if things were more equal, but it'd probably be better for me if it isn't.


Ok. Now I'm seriously depressed.

I think I will head to this [1] again to celebrate our insignificance in the whole scheme of things. Due to the very likely assumption that the thing that contains the universe (and also is the universe) is not bound to time, and that quantum effects lead to every possible reality (because it has infinity at its disposal), this is just one of those reincarnations where human nature gets in the way of social equality.

[1]: https://news.ycombinator.com/item?id=17559822

A link that expands on the middle class discussion and a George Carlin video:

https://www.thomhartmann.com/blog/2014/04/middle-class-not-%...

https://youtu.be/XdH38k0iUgI


I'm certain that the phenomenon of a widespread middle class is bizarre in the grand scheme of things. Exponential functions don't generally have a bulge in the middle of the curve. We had one for a few decades and may yet again for another few. In the long run, it's quite unlikely that capitalism can sustain an exponential curve of wealth yet have a bulge in the "middle class" section.


Is there strong evidence that this will happen anytime soon? I've heard many arguments that claim this wouldn't happen anytime soon, particularly because of the level of difficulty and the required determination.


I don't know the situation in the US, but here in Germany the companies always say "Oh no, we don't have enough computer scientists" and continue to pay low salaries. Many people thought that CS grads were needed and chose it, but they earn average salaries and sometimes even have to fight for the good seats.

Our definition of skill shortage is that there are only 3 applicants for one open position. This means a skill shortage doesn't mean there's no one. It means that there are 3 people hoping for the job, but the companies would love to see as many as they can so they can push the salaries down.

> particularly because of the level of difficulty and the required determination

Maybe my imagination is limited, but I think there's a finite set of things we need until it gets exponentially harder to make big leaps (all industrial nations suffer from this phenomenon in advanced sectors) - basically the law of diminishing marginal returns [1]. And we will continue to improve the developer experience so it gets easier to build computer programs.

[1]: https://www.investopedia.com/terms/l/lawofdiminishingmargina...


“We don’t have enough X” can also mean “teach more X in schools/tertiary education so we don’t have to pay for training”.


This is exactly what they're doing. I think it's irresponsible to manipulate the youth by telling them that everyone should be a programmer. The companies won't feel responsible for all the jobless or unhappy people they've created through lobbying and will blame politics. Man, I seriously hope I'm wrong, but I'm pretty sure I'm not.


Not only is Python easy to learn for beginners, it allows for some deeply expressive constructs and patterns that are difficult or impossible in other languages. This is why even though I started with C/C++ and Java pays the bills, Python is where my heart is. Thanks Guido.


Well yes, if you come from C/C++/Java, Python is going to feel that way ;)


Very appropriate timing. My son wanted to learn programming, so we started yesterday with python, using Jupyter notebooks on Azure (notebooks.azure.com). No setup, easy to create little code snippets, and he can do it all from his Chromebook. Python is intuitive, simple, useful; best first language in my opinion.


In case anyone else was confused by the literary reference, "one with Nineveh and Tyre" is from Kipling's "Recessional":

https://en.wikipedia.org/wiki/Recessional_(poem)

Which, I think, is the poem that made "lest we forget" a well-known phrase (though the original is from the Bible).

Nineveh (now Mosul, Iraq) and Tyre (in Lebanon) were great cities of their time, but are now mostly known for their ruins. In Britain, they're commonly used as symbols of bygone empires.


Python is a great language to start programming in. My biggest frustration with traditional programming languages (well, C/Java mostly) was just how much upfront effort and understanding is required to do useful stuff. Whereas with Python it is pretty easy to whip up a script, read input and transform it into output in real time, etc.

That said, I think it's also important for Python programmers to dabble at least a little bit with Java to really understand how OOP works. Python does have OOP but it has certain quirks which can confuse newcomers. Java helped to clarify my understanding of OOP a lot, even though I've only used it marginally.


W/O going into the war of "I'm better than you" (Python vs Java)... Python is built to make it easy to write good programs... whereas Java is built to make it hard to write bad programs... same objective, but a different thinking/approach to achieve it... hence one (Python) would eventually have larger adoption.


I agree with this sentiment, but it is a bit of a double-edged sword for Python professionally. There is some really good Python code out there, but there is also a lot of bad stuff. I know this can be said for every language, but I think I have seen it most in languages like JS and Python because of the ease of access.


A better measure would be the amount of bad code produced conditioned on the skill/experience of the author. I'd be more worried about a beginner C programmer than a beginner Python programmer.


I don't think Java's take on OOP is good to learn before you really understand OOP already. Its heavy handed style of OOP really hasn't aged that well and is rightfully being avoided in all newer languages.


I couldn't agree more. I don't know how to explain it exactly, but I had the opposite experience. Learning basics of functional programming in Python improved my OOP in Java over time.

I think it's probably more like this: learning a second, contrasting topic really helps explore the motivations that drive their respective goals.

OOP and Functional are broad classes of contrasting design patterns and philosophies but share the common idea: be principled about state.


There is no universal OOP specification to follow.

In Java's case, a lot of OOP-flavored design patterns were created due to Java's lack of primitives like closures/lambdas and higher-order functions back in its early days.

In short, Java's OOP solves many problems that are unique to Java and could be easily avoided in other languages.


I'm going to chime in to 100% agree with you. I'm a big fan of both of those languages and their respective ecosystems.

I'm in love with the idea that there's a good case for polyglot (biglot?) program synergy between the two. Keep your eyes open on Jython and GraalPython!


Other than interfaces and generics (which would make no sense to add in Python, since it's dynamically typed), can you name some OOP features Java has that Python doesn't have?


This may sound super n00b but I really like that Java is explicit about everything. So, for instance, you explicitly mark variables and methods as private with the keyword 'private'. In Python you have to use an underscore, and it's still not really private, it's just obfuscated. That's just one example, but in general I really like that kind of by-the-book verbosity, instead of having to mentally map it onto the OOP concepts.


Double underscore is the real 'private':

    >>> class Foo:
            def __fn(self):
                return 2
    >>> Foo().__fn()
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    AttributeError: 'Foo' object has no attribute '__fn'
    >>> Foo()._Foo__fn()
    2
A single underscore is more like 'protected', i.e. it doesn't prevent subclasses from easily/accidentally overriding your function.


I'm less concerned with encapsulation, but the explicitness is a godsend the larger your codebase or team.

Python keeps code much smaller but above a certain size I wish I could "promote" parts of my code to Java.

Partial typing in Python just isn't close to feeling the same.


Why would you want to actually enforce privacy? That makes it harder for the user to patch. Better to provide a mild discouragement via "_name".


Because in traditional object-oriented design you would extend rather than patch. It's the open-closed principle.

There are many benefits to enforcing privacy, and one of the biggest ones is that you can make changes to the internals of your library without breaking your users' code, because they never had access to the internals.


You can get that benefit with Python. Name mangling helps. It gets the benefits of `final` without the annoying enforcement.

Don't stop me from patching your buggy code. If writing an extension is more efficient, that's how I'd solve the problem.


That's an eternal ideological debate where there is always a rebuttal of the form "a-ha - but you could just...", but the rationale is this: your class will probably be doing things with the private variables that aren't obvious, and accessing them from outside will break things (either because they have a different value than you expect, or because they expect a different value than you set).

Of course, if you have a perfect understanding of the code, this will never be a problem, as the program is always executed as it is written. But most likely, not everyone in your organization will be as patient, smart and attentive to the details as you are.


I don't understand the argument, though. Python's underscore variables are meant to not be accessed from outside. The argument against them isn't that you might be doing some non-obvious things inside the class, that's the whole point of them!

It's a way for the class user to say "I know this is terrible, I know this might break, I want to do it anyway".


The obvious usage of privacy is so you don't expose an API you don't want to maintain. It's also occasionally used to maintain some invariants; for instance, you can have a public method that wraps around some protected interface that does input checking/clean up. Having the compiler enforcing privacy, in this case, guarantees that whatever actual method you dynamically dispatches to has the "right" input (accidentally calling the interface directly won't compile).


Private doesn't mean much. You can still access private members in Java using reflection.


This hole is incrementally being closed with the introduction of Java 9's module system. You can only deep-reflect into classes in your own modules or those "open" to you.

There are both declarative and command-line-parameter ways to open things, so it's not as rigid as it sounds.


Check out Ada.


> can you name some OOP features Java has that Python doesn't have?

Encapsulation. In Python you can write code in an OO style and benefit from inheritance, but it’s really only good for people writing libraries and frameworks. For end user code, objects are best avoided because they can pick up state in unexpected ways, which leads to problems that are extremely difficult to debug.


Could you give some examples? What encapsulation features does Java have that Python doesn't? You can get private methods with double underscores. It's true that attributes in Python objects are all public by default and in Java you have to be more explicit to get public attributes, so maybe that counts, but otherwise what do you mean by "objects can pick up state in unexpected ways"?


So let's say you have:

  class Foo():
      def __init__(self):
          self.bar = 1
Then let's say you have foo_instance imported in your module, and you see that foo_instance.bar has a value of 2.

You can see that foo_instance is imported from module A. So you go there, and see that the instance wasn't created there, it was imported from module B. But wait, it wasn't created there either, it was imported from module C, etc.

Somewhere along the way bar picked up the value of 2. But it's almost impossible to figure out where, because there isn't any requirement to use a uniquely-named accessor to modify the instance, so you can't just grep through the code to see where that method was called. (And maybe the name of the instance has changed a few times along the way, for whatever reason.)

And what's worse, the code that modified the instance might not even be visible in any of the modules, it's possible that someone took advantage of Python's ability to change the way the importing system works to mess with either the class or the instance. Or maybe there is some global middleware that's messing with objects or instances, or unrelated module Z is secretly monkey patching module B somewhere in the middle of this process. Who knows, and good luck figuring out.

Whereas if you just write the user-code logic in functions then it's not possible for developers to create situations like this. That obviously isn't possible for a library where the entire point is to make everything extensible and where you don't know how people will want to use something in advance, but hopefully the people writing libraries and frameworks have good enough judgment not to completely abuse the tools.

I'm not sure if I can really articulate how this is different than Java or Swift or whatever, but for whatever reason it just feels different.


I understand your point. I think this is partly related to the fact that instance attributes are public by default and do not require accessor or mutator methods (though you can easily create them with the @property decorator) and partly due to the fact that Python is a very dynamic language.

You can make instance variables "culturally private" by prefixing the variable name with a single underscore, and you can name-mangle (to make it effectively private, unless someone really goes out of their way) with double underscores, but I know that's annoying to do for most/all variables and that making them private by default has advantages.

Python is just not as strict a language as Java, Swift, C#, etc. That has advantages and disadvantages.

Personally, this hasn't generally been an issue for me, because I've rarely directly modified instance variables in my own code unless I'm intentionally doing something really "meta", and it's not a pattern I've seen much of in other codebases.
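To illustrate the @property route mentioned above, a hypothetical sketch:

    class Account:
        def __init__(self, balance=0):
            self._balance = balance     # single underscore: "culturally private"

        @property
        def balance(self):              # read-only public view
            return self._balance

        def deposit(self, amount):
            if amount <= 0:
                raise ValueError("deposit must be positive")
            self._balance += amount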


I think this comparison [1] in particular explains a lot of Python's continued success over the past 5-10 years. For a very long time, CRUD back-end programming was one of the biggest draws of new users to Python. In the last couple of years, interactive data analysis has completely overtaken it.

[1] https://trends.google.com/trends/explore?date=all&geo=US&q=%...


The great advantage of Python is the vast number of libraries out there for pretty much anything you can think of. As a C# developer that's the one thing, and probably the only one, that I envy about Python.


Back when I was getting more serious about programming (2010-ish), I had to decide between Perl vs Python as those two were the most widely used in my domain (bioinformatics).

I remember one of the arguments for Perl was exactly this: the vast number of libraries in CPAN. You would be able to do tasks much quicker because you would just need to install these libraries and go on with your tasks.

Still, I decided to go with Python because it made so many things 'click' for me about programming in general + because of its REPL, among other things.

It turned out to be one of the best decisions I have ever made :).


I really dislike converting Python to a different language. What type is this? Go 5 libraries down to find out ;)


That's a common misunderstanding. Python's type system is in the unit tests necessary to ensure the code is doing what it should. This is a boon because all of the behavioral rules are co-located for efficient reference. ;)


It's more a traits system than a type system. Who cares if it's a list so long as I can append to it?


This seems misspoken. Python hasn't necessarily brought computer programming to a new audience. A new audience is being brought to computer programming. They just happen to be given python as their first steps.

There is nothing intrinsic to python that is helping these use cases. Some people like it. Some people don't. I have not seen studies showing any real advantage or disadvantage to it. Do those exist, and I've just missed them?


Perhaps. Or perhaps programming instructors like Python better. When I taught statistics, R was more popular and featureful for the task than Python, yet I chose to teach my students Python.


If the competition is R, I can see how python is a pretty language. I think R still has prettier charts, with ggplot. However, there can be little argument that it is a peculiar language.

That said, I don't see much that python really has going for it in the pretty scenario, either. Most of the examples that make it worth looking at get ugly as quickly as most any other programming language. And the error messages are no more helpful than any of a myriad other languages.

None of which is to say I think it is bad. Just I suspect it is getting credit for some intrinsic quality that doesn't really exist. It is getting used because it is the language people know is getting used.


Not as any other language, but as any recently designed language. They (almost) all have good REPLs and a decent variety of data structures.

It's easy to compare Python against C or Java and say how great it is, but against Rust or Dart, maybe not so great. However, popularity has its own benefits.

Still, compare against Go. There's a major difference in philosophy of deployment. Python's modularity causes dev ops problems, but it also means that any user can monkey patch whatever they need. With Go you need access to the original source, which isn't necessarily provided.


Fair, I meant as any comparable language. Though, I confess I suspect people could make more progress with C and the like than the advocates of python would like to acknowledge.

And I'm quite fond of lisp nowadays.


Python's core strength, among the half dozen languages I use in daily work, is that it is simple and focuses on easy human expressiveness.

It is the best example of 'simple is hard' done right. Its core abstractions are carefully selected and woven together in such a way that using them to organize/orchestrate your thoughts is just natural and fluid.

Huge thanks to Guido for presenting the world such a meticulously crafted work.


The Brian analogy was a surprisingly apt way to explain the PEP 572 kerfuffle


It's really interesting. I am a beginner to Python and just read the above article. I too enjoy coding with Python.


I just love Python. It's my favorite programming language for many reasons but one of them is writing python never feels like the language is working against you. Python always feels like it's helping you.


Python is certainly way better for a layman trying to do small things than C++


True, but for me, Python being my 4th language (after Fortran, C, C++), there were lots of gotchas, especially with implicit type conversions, mutable classes, and passing by value / reference, which all seemed quite arbitrary in my first steps.

Maybe for a complete beginner these are no-issues.


Can you explain implicit type conversions problems in python?


bools can be silently converted to int. 1 + True is ok, but 1 + "true" throws.

Sometimes you do want to do math on bools, but it would avoid a class of errors if you had to wrap them in 'int(the_boolean_variable)'.


> Sometimes you do want to do math on bools..

Can you provide a concrete example for when this would be useful? I assume you're not talking about binary arithmetic.


Here are some examples from the Python standard library:

  multiprocessing/pool.py:
      self._number_left = length//chunksize + bool(length % chunksize)
  test/test_math.py:
      tmant = tmant // (2*h) + bool(tmant & h and tmant & 3*h-1)
From NumPy and SciPy:

  numpy-1.11.0/tools/swig/test/testArray.py:385:
       sys.exit(bool(result.errors + result.failures))
  scipy/scipy/signal/signaltools.py:2003:
       n_out = n_out // down + bool(n_out % down)
  scipy/scipy/signal/signaltools.py:3001:
       n_out = x.shape[axis] // q + bool(x.shape[axis] % q)
Here's one from my own code, where I allow a query to be specified through --query , or as a hex-encoded query through --hex-query or as an input file through --queries, but I only allow one of them:

  if bool(args.query) + bool(args.hex_query) + bool(args.queries) > 1:


>Here's one from my own code, where I allow a query to be specified through --query , or as a hex-encoded query through --hex-query or as an input file through --queries, but I only allow one of them

This would be better written as a mutually exclusive argument group. Assuming you're using argparse: https://stackoverflow.com/a/13310109


"Better" only if you want the default error message. As the second rated answer for the question comments: "If you like to have more control about error message and the like, you can of course check the results later"

For example, this code:

    import argparse
    parser = argparse.ArgumentParser()
    g = parser.add_mutually_exclusive_group()
    g.add_argument("--queries",help="queries help")
    g.add_argument("--hex-query",help="hex-query help")
    g.add_argument("--query",help="query help")
    args = parser.parse_args()
    print(args)
When passed the command-line arguments "--hex-query AB --query CD --queries ASDF" it gives:

  error: argument --query: not allowed with argument --hex-query
My code gives:

  error: Can only specify at most one of --query, --hex-query, or --queries


Looks like the libraries use the technique to handle those conditional off-by-one situations that other languages might resolve with a ternary expression. Thanks!


FWIW, I only searched for lines containing 'bool(' and '+'. That means I missed some of the other uses, like:

  unittest2/compatibility.py:25:
    if bool(unc_path) ^ bool(unc_start):
  scipy/signal/fir_filter_design.py:266:
    pass_nyquist = bool(cutoff.size & 1) ^ pass_zero
  pypysrc/py/_path/common.py:65:
    if bool(value) ^ bool(meth()) ^ invert:
There's also a place where the bool is turned into an int, but that extra conversion isn't actually needed - the following would work without the int() call:

  scipy/stats/morestats.py:2083:
    correction = 0.5 * int(bool(correction)) * np.sign(T - mn)


  sum(x > 3 for x in iterable)
True == 1, False == 0, so this counts how many items in iterable are greater than 3.

But I think that's the only non-dirty way I've used it, and if that wasn't possible a trivial count() function could do the job.

If I remember correctly, bool subclasses int mostly for compatibility with pre-existing conventions. People were already defining their own "True = 1" and "False = 0" constants before they became part of the language.
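The trivial count() mentioned above could look something like this (the name and signature are made up):

    def count(predicate, iterable):
        # same result without leaning on bool-as-int arithmetic
        total = 0
        for x in iterable:
            if predicate(x):
                total += 1
        return total

    count(lambda x: x > 3, [1, 4, 5, 2])    # 2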


If it wasn't possible you could still do:

    sum(x > 3 for int(x) in iterable)
And avoid a class of errors in other places.


That code is invalid, because you're using "int(x)" as an assignment target. What would you expect it to do?


sorry:

    sum(int(x) > 3 for x in iterable)


That works better if you expect things in iterable that aren't ints, but it's still summing bools. int(x) > 3 returns a bool, so in the end it's still doing something similar to

  sum([True, False, False, True, True, False])
But if bools didn't implicitly behave like ints, but could still be converted to ints, you could do this:

  sum(int(x > 3) for x in iterable)


issubclass(bool, int) == True in Python 2. They're not converted to ints silently they are ints.


The point is they shouldn't be. Knuth argues against implicitly treating bool as int in Concrete Mathematics.


Agreed, however you have to take context into account. Python 2 didn't always have a bool type (granted, already a mistake); before it was introduced, int was used instead. Making bool a subclass of int made it possible to use it even in combination with earlier code, which made adopting it trivial.

If bool wouldn't be a subclass of int, some projects probably would still be using int instead of bool.


Implicit string and unicode conversions were confusing in Python 2.

True + True evaluates to 2, although that is technically not a type conversion.


True + True is still 2 in Python 3.


It is not a problem, it was just unexpected behavior for someone trained in strongly typed languages.

For example, you want to calculate the performance of your ML algorithm by counting correct predictions / total. My expectation was that the result of a division of integers would be a float, especially if the remainder is not 0.


The division operator was one of the things that changed in Python3. In Python2 the (/) with integer arguments is integer division, rounded down. Python3 introduced (//) for integer division and now (/) always does floating-point division.
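Concretely, in Python 3:

    7 / 2      # 3.5   true division always returns a float
    7 // 2     # 3     floor division
    -7 // 2    # -4    floors toward negative infinity, not toward zero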


That's not an implicit type conversion, that's just a different convention for what '/' should do on integers.


I think this was actually addressed and back-ported to 2.2+, with PEP 238.

https://www.python.org/dev/peps/pep-0238/


In what strongly typed language does the division of two integers result in a float?


Many languages are not strongly typed (where "strongly typed" means "variables must be predeclared with their types (though they may be inferred)"). If your only experience is with strongly-typed languages, then MANY things about Python will be different, because Python is one of many languages that is not strongly typed.

For example, in Scheme (a Lisp), a division (/) presented with exact input values and a non-zero denominator will produce an exact value (an integer or rational). E.g., (/ 1 2) produces the exact fraction 1 divided by 2, and not 0.

In addition, both Scheme and Python have different ideas of what an "integer" is, e.g., both are happy to compute the exact value of 2 to the 9999 power. There's no requirement that "integer" correspond to the underlying machine integer.


Double check me on this, but I'm pretty sure your definition of strong types is wrong; that's static typing.

Python is strongly typed but not statically. Weak typing is more like 1 == "1" being true, i.e. implicit parsing or casting.


Weak vs strong typing doesn't have a standard mathematical definition, but you are correct that it usually has to do with implicit type conversions.


Decades ago, the Pascal camp hijacked the term for the requirement of explicit conversions in a static language.

Pascal would be strongly typed because for instance, characters and integers will not inter-operate without chr and ord, and also floating and integer conversions must be explicit. This is in contrast with something weakly typed like C. The weakness in C can be a problem. For instance, a floating to integer conversion that is out of range of the target type invokes undefined behavior. Also, when in range, it can produce a value not equal to the original integer, due to lack of precision. So an innocuous statement like f = i; that passes static type checks without any required diagnostics can go wrong. If a cast were required to make the diagnostic go away, then that at least creates a visible record in the program text that a dangerous conversion is taking place.

Another use of "strongly typed" in the realm of Pascal is that it refers to name equivalence for type alias definitions. The C typedef mechanism is weakly typed because "typedef int user_id_t" doens't create a new type; user_id_t is the same thing as int, such that a pointer to user_id_t is compatible with pointer to int and so on.


The language of the real world?

Nobody doing maths with pencil or paper will think 3/2 equals 1.

More to the point, the symbol 1 represents a float as much as it does an integer.


That result should be a ratio, surely


In Python 3 you at least get 1.5 rather than 1 - though it's a float, not an exact ratio; fractions.Fraction gives you that.


Nice reference to Karel Čapek's "R.U.R." (https://en.wikipedia.org/wiki/R.U.R.).


To me, Python has represented the nice friendly face one can put on the mean nasty stuff written in C++


  python -c "import this"


Archive link for losers like me who can't get past the paywall: http://archive.is/8mRZ0


Meh.. try Perl.


When I was younger I used Perl for a lot of the things that I use Python for today: websites, command line tools, and one-off scripts. It's not a bad tool, and it has real strengths, especially for short one-off scripts.

I was in the pedantic "meaningful whitespace is evil" camp back then, so I avoided Python. Then I got an internship in a house that used Python. That forced me to give Python a real try and I got over the whitespace thing... I learned that for all intents and purposes this language I had been scoffing at was a better tool for many of my usual Perl use cases, and more, thanks to its extensive standard library.


Why?


Perl makes me happy, Python makes me want to go to sleep.



