If I recall correctly, the European Parliament doesn't have the power to introduce new legislation on its own. Instead, they can ask the Commission to present legislative proposals. This might be what's happening here...
Why would they be leaving it? I can easily connect to a Russian website with servers in Moscow just fine from anywhere in the world. The same applies to EU citizens.
I'm a strong believer in freedom, including freedom from unnecessary taxation. When I say G/F could "leave" the EU, I mean closing their physical locations and walking away from the tax farce they have to put up with; they could still serve EU citizens.
I'd love to see the story unfold if G/F said "we are American companies and will only pay US taxes" and quickly closed their EU offices. Do you think the EU would create a firewall cutting citizens off from Facebook/Instagram/Gmail/Google Search, etc.?
Seems to me that in this crazy scenario the EU's crooked politicians are left making threats they can't really follow through on. They all probably use Gmail, and how easy is it exactly to start over on a new email account? How about all your contacts and pictures on FB?
Seems to me that in this world F/G would operate just the same and EU politicians would walk away crying, "You can't do that, it's not fair."
Is unnecessary taxation all taxation? By your argument every company should just find whatever the lowest tax area of the world is and set up their headquarters there, no one will be able to stop them!
Also, if you believe in freedom, why don't you believe that countries have the freedom to stop these massive corporations from extracting wealth from their society and returning nothing? Why does only the corporations' freedom matter?
I don't subscribe to your second paragraph at all. "Extract wealth" how? I'm from the third world and I can tell you instantly the great value people derive from applications like WhatsApp. To someone in the EU or US it's easy to deride these as silly applications, but to those who lack the infrastructure they open new possibilities.
Take, for instance, using WhatsApp as a little store where people can offer services and goods. My mother-in-law dedicates her WhatsApp account to selling hand-knitted baby clothes, and it gives her reliable communication with clients.
Such applications are a NET BENEFIT to everyone involved, and it's clear why WhatsApp was given such a high price tag!
If I might ask, where are you from that you think this way? I can tell you from personal experience and basic Adam Smith economics that these companies offer vastly more benefit to consumers and society as a whole than the little money that can be taxed from them.
Also note that Latin American countries, and 99% of other countries, don't propose to tax the tech companies, and they still see vast benefits.
The EU has wealth in it from its society and economy. The companies extract it by coming in, using their power to take control of the market from local companies, and then not paying their share in taxes by using every loophole they can find.
Yes, they provide a benefit, but you're bringing up Adam Smith as if this were a situation with perfect competition. I might be on the wrong side of this, and they may provide a benefit big enough that the EU would get less net benefit from a local company, but can you admit that there could ever be a case where a company was acting in a way that's detrimental?
Corporations don't have money. It's also a giant cat-and-mouse game: governments want to tax, and companies want to minimize that. So you end up with weird situations like Apple and Google hoarding cash because they don't know how to put it to work and they sure as shit don't want to lose it to the taxman[0].
So if you substantially lowered or entirely removed corporate taxes, they would bring that money back to the US and spend it on R&D or capital investments, or return it in large part to shareholders via dividends!
What, oh what, do people usually do when they get sudden influxes of cash? They spend it or reinvest it! They don't bury it or burn the cash for warmth like Pablo Escobar. That seems like a better end goal than having the cash sitting idle abroad, or being spent by crooked politicians who, by virtue of being in office, purport to be holier-than-thou and always manage to spend money not earned by themselves.
I consider all forms of advertising that distract me from the content I'm looking for bad for me. I don't want flashy colored banners telling me what I should buy. There are already plenty of distractions, and advertising is one of the poorest forms of them. I don't think I'd mind if advertising, in its current form, died.
The article doesn't portray a realistic view of crime rates in Italy - they say "violent death is commonplace", but I don't think that's ever been true, and it certainly isn't now.
Be careful, as always, but Italy isn't an especially dangerous place. Some regions have higher crime rates than others, but even then, I wouldn't worry too much about it.
> they say "violent death is commonplace", but I don't think that's ever been true
The Greens happened to be in a bad place at a very bad time: the late '80s/early '90s saw big changes in organized crime and political structures, with all the trouble that sort of situation inevitably brings. Things are nowhere near as tense today. Unless you specifically head for the ghetto, the worst that can happen to you as a tourist is overpaying for taxis.
Well, I have no issues building a desktop, but if somebody asked me about what (modern) components to choose for no driver issues, I wouldn't know.
Avoiding issues and getting support could certainly be worth something.
I find that buying ~6 months behind the state of the art gets you 90% of the performance, often for half the cost.
I've been custom-building my desktops since the '90s and I can't remember the last time I had an issue with hardware; I think it was a GeForce MX440, so ~2003.
The driver story for desktop hardware on Linux is (in my experience) absolutely great if the hardware, or the core chipset/device it uses, has been out for 6 months or more.
I have given this advice on Hacker News before: on eBay, 2-3 year old workstations can be had for 300-400 euros. Typically these are workstations from companies that replace machines every 2 years or so.
These are typically equipped with Xeon CPUs, plenty of memory, sometimes ECC etc. Moreover, since they are usually HP/Dell workstations, they are certified to be compatible with Red Hat Enterprise Linux, so all the hardware is pretty much guaranteed to work.
This is very cool. I wonder what the electricity would cost if I left that machine on 24/7 for a year. Power in Germany is a bit expensive (learned that the hard way).
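The arithmetic is simple if you plug in assumed numbers (the 150 W average draw and €0.30/kWh price below are placeholders, not measured values):

```python
# Rough yearly electricity cost for a machine left on 24/7.
# Both figures are assumptions for illustration, not measurements.
avg_watts = 150        # assumed average power draw of the workstation
price_per_kwh = 0.30   # assumed EUR/kWh (German residential power is pricey)

hours_per_year = 24 * 365
kwh_per_year = avg_watts * hours_per_year / 1000
cost = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> EUR {cost:.2f}")  # 1314 kWh/year -> EUR 394.20
```

A machine that idles lower would obviously cost much less; the point is that the yearly bill scales linearly with average draw.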
ACPI is a busted-ass standard. OEMs are free to, and do, do pretty much whatever. Again, the only relevant standard from an OEM's perspective is "does it work on Windows".
Last year I bought an i5-4690K, a slightly outdated but top-of-the-line mobo, 32 GB of DDR3-2400, and a bunch of 1 TB WD Blacks... The total price was about 800 USD, yet it feels like I bought something much more expensive. Everything I throw at the machine runs fast; even the HDDs are fast enough that I don't feel the need for an SSD.
I ask because I've been thinking about making my next card an AMD one, since I'm tired of dealing with proprietary drivers on my current nVidia card...
I was using only nVidia my whole life, and got tired of some of their business-related bullshit.
So I thought AMD was going to be better, because they try to be good and nice...
Well, AMD hardware is NOT as good as nVidia's. The 380X in particular is really fast but EXTREMELY power hungry, so much so that AMD had to cripple it severely: sometimes it starts stuttering heavily in games before it even gets hot, and when I look at the logs, the reason is that it hit its power usage limits.
And they're bad at marketing, while also doing the same bad marketing things nVidia does. AMD shills do exist: I got banned from chat rooms after asking how to fix bugs, because they want to give the impression their drivers are bugless... but they are complete crap too; even their Open Source driver for Linux is so much crap it was entirely rejected by the kernel team. They also deny their cards have physical bugs (the RX 480 has the same issues as the 380X, but ALSO has unbalanced power usage, drawing too much power from the mobo and damaging it), and so on...
I tried asking for help with my card's issues through both AMD's and Sapphire's (the board maker's) official and unofficial channels, and I was treated very badly: people would ignore tickets, give me nonsense information, and several times tell me to just return the card and buy another one. I can't do that, because I bought the computer in the US but live in Brazil; if I could, I would have swapped my 380X for an nVidia GeForce 970, which cost the same in several countries back when I bought the 380X.
Also, AMD drivers don't crash the whole OS the way nVidia's do, but the drivers themselves crash a lot more, on every OS: AMD drivers restarting (and taking your game/software with them) is fairly common, as are weird errors (the updater crashing, the control panel crashing, etc.).
> their Open Source driver for Linux is so much crap it was entirely rejected by the kernel team
No, amdgpu was not rejected by the kernel team. A particular implementation of the driver was rejected because it implemented an abstraction layer, and that would make it nearly impossible for kernel devs to maintain.
> drawing too much power from the mobo and damaging it
If you could point to an example of this happening, I'd appreciate it. My knowledge of the situation is that some models of the RX 480 can run slightly out of spec, pulling a little too much power from the motherboard. Any motherboard I've heard of could withstand that. And if you really care you can enable an option in the driver that causes it to run strictly in PCI spec.
I'm not an AMD shill, I just think you've misrepresented some of the issues at hand. AMD make mistakes, for sure. But not every mistake is as crippling as you've implied.
"Slightly" out of spec? You mean pulling 7.7 amperes through a part rated for 5.5 amperes, which might (due to dust and other factors) pull all 7.7 amperes through pins that are supposed to carry only 1.1 amperes each.
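For context, those currents translate into watts over the slot's 12 V supply like this (5.5 A rated and 7.7 A reported are the figures from the comment above; the 12 V rail is the standard PCIe slot supply):

```python
# Power through the PCIe slot's 12 V supply at rated vs. reported current.
V = 12.0             # PCIe slot 12 V rail
rated_amps = 5.5     # rated slot current from the comment above
reported_amps = 7.7  # reported overdraw from the comment above

print(f"rated:    {V * rated_amps:.1f} W")                # 66.0 W
print(f"reported: {V * reported_amps:.1f} W")             # 92.4 W
print(f"overdraw: {reported_amps / rated_amps - 1:.0%}")  # 40%
```

That's roughly 40% over the slot's rating, and concentrating that current on a few pins is the worrying part.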
Just look in AMD's own official forums at threads created before they shipped the driver patches. In one thread, for example, a guy posted a photo of his melted mining rig, and then there were several pages of people calling him an nVidia shill, and no one helping.
The handling of those incidents was so bad I stopped visiting the AMD forums entirely; it was pure hostility toward anyone with any problem, even unrelated ones.
I couldn't find the post you were talking about (I was really hoping you could provide a link). Instead, I found a thread filled with people talking about how you have to be careful with your mining rigs because "Any electric appliance can catch fire." [1]
Bitcoin mining isn't a great example; if you look further into that thread, it's not just AMD users whose rigs have caught fire in the bitcoin mining situation.
For some reason I usually get downvoted when I bring this up, but I still believe NVidia is a better experience on Linux than AMD.
Like you, I always ran NVidia because of their support for Linux, but recently tried to use an AMD card for a Linux build. I ended up buying an NVidia instead, and all my problems have gone away.
For me, the big problems with NVIDIA drivers started showing up when I moved to a rolling release distro (OpenSUSE Tumbleweed). Proprietary drivers don't like frequent kernel upgrades.
The 380x and 390x are HOT.
Newer Radeons are much better in that regard, and generally very decent cards - 470/480 and the recent rebrands/reclocks 570/580.
My 380X only gets hot when using the default fan control, which is complete crap.
With my custom fan curve it starts getting performance-limited while still around 60°C (and the fan noise still isn't perceptible over the sound of a game), because it hits the power limits instead.
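A fan curve is just a temperature-to-duty mapping; here's a minimal sketch with made-up curve points (nothing here reflects AMD's defaults or any real tool's format):

```python
# Minimal fan-curve sketch: linearly interpolate fan duty (%) from GPU temp (C).
# The curve points are illustrative only, not any vendor's defaults.
CURVE = [(30, 20), (50, 35), (60, 55), (75, 80), (85, 100)]

def fan_duty(temp_c: float) -> float:
    """Return fan duty (%) for a temperature, clamped to the curve's ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(55))  # halfway between the 50C and 60C points -> 45.0
```

Real fan-control tools (the driver's control panel, or fancontrol on Linux) implement exactly this kind of interpolation; the win over the defaults is ramping the fan earlier and more steeply.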
The 380X has the same power limit as the 380, despite having more GPU power available and double the RAM; I have no idea why they made such a crappy decision.
Its power problems are so severe that undervolting the card makes it MORE stable and faster, because it reduces total power usage and triggers the power limits less often. (The same applies to the 480, by the way: during the "PCI slot melts" crisis, people found that undervolting it made it behave much better.)
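The reason undervolting helps is that dynamic power scales roughly with the square of voltage (P ≈ C·V²·f), so a small voltage drop buys a disproportionate power saving. A sketch with illustrative numbers only:

```python
# Why undervolting helps: dynamic power scales roughly as P = C * V^2 * f,
# so a ~5% undervolt at the same clock cuts dynamic power by nearly 10%.
# All values below are illustrative, not real 380X figures.
def dynamic_power(c, volts, freq_hz):
    return c * volts**2 * freq_hz

base = dynamic_power(1e-9, 1.20, 1.0e9)         # assumed stock voltage
undervolted = dynamic_power(1e-9, 1.14, 1.0e9)  # assumed -5% undervolt
print(f"dynamic power saving: {1 - undervolted / base:.1%}")
```

Less power drawn means the card bumps into its power limit less often, which is exactly the stability/performance effect described above.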
I still hit bugs with Arch, and I know they can hit Ubuntu as well[0]. For instance, if anyone can connect to an Amazon Echo as an audio device, I'd like to know what version of bluez and/or pulseaudio they're using, as it stopped working for me after an update a while ago...
[EDIT] Though to be fair, I think they did call out bluetooth as one of the things Ubuntu was going to focus on in their next release IIRC.
Ok, I would assume this is something that might differ between countries. Here in Scandinavia I've met extremely few people running their desktops exclusively on WiFi, since it's generally unreliable when a lot of devices are talking to the same AP/WiFi router in a noisy environment with lots of other networks on the same frequencies. Actually, I would say running Powerline to desktops is the more standard approach here.
Uhh.. Hi, I'm a Scandinavian (Swede). I got a desktop exclusively on WiFi. I've never seen an office or a home actually use "powerline" (Network over electricity network).. Oh well, one anecdata against another anecdata :-)
Powerline networks are mainly utilized by people living in concrete multi-story houses that either do not wish to install a proper Ethernet backbone to all rooms or take a gamble with WiFi due to higher cost, thickness of walls or other reasons that might seriously affect connectivity as mentioned before.
I used to run my own IT support company with several employees (think Geek Squad), serving enterprise customers as well as the private sector, and people running Powerline is actually a lot more common than you might think. My parents, for instance, run it alongside WiFi in their house: desktop, IPTV, and camera surveillance are on Powerline, while tablets and phones are on WiFi.
In all my years in IT I can remember encountering two situations where WiFi was used on a desktop machine instead of Ethernet or Powerline. One was a car dealership with a salesman sitting in a "glass box" in a building shared with another company (so, no Powerline); the other was an enthusiast who built himself a new computer whose new Asus motherboard came with 802.11n built in.
Personally, I run all my desktop machines on the 10 Gbit CAT6a network I have at home (I have WiFi as well, but it's on another VLAN with no access to the network infrastructure, mainly used by the kids). I can stream multiple 4K streams from my FreeNAS server while downloading huge files off the Internet without breaking a sweat; try that over a WiFi connection and you'll hit a brick wall pretty fast.
The reason wifi is more common in the US has to do with cost.
Most people buying a new house can't stomach multi-line pulls from each room to a central wiring closet (not sure why, because it can't be a large portion of the overall cost of a new house). Plus, you have to have that central closet (or at minimum a panel) somewhere out of the way, and most people just don't get that kind of tech (the idea of a central area for a home server, networking gear, etc.).
So the lines aren't installed. (At one time, houses were offered with the option, and if you're willing to pay today you can still get it, but most people don't.) After-the-fact retrofits aren't done because such an install is very difficult (especially in modern houses with horizontal firebreaks between the verticals, little to no attic, vaulted ceilings, etc.), which also means it's expensive.
So instead, people go with wifi. It's cheap, no need for a dedicated wiring/network termination panel and/or closet, and can be taken down and taken with you if/when you move.
Personally, I prefer a wired system; when I moved into my house I installed a few drops myself where I knew there'd be some dedicated hardware (TV area, my office, library, and my shop); the other rooms I never installed anything because it didn't matter. For those, the wifi I have fills in those blanks adequately. I ran all the lines back to a custom wiring closet I built in my shop, and terminate everything there (plus a few of my servers live there too).
This sounds plausible to me. Here in Omaha, when I lived in an apartment and there were 30+ APs visible I had to be careful to pick the frequency based on what worked and what didn't, and when I did get it to be reliable I had short range. I am pretty sure this was just because of noise and cross-talk.
Now in a house I see maybe 10 APs and they are all at the edge of their range and I rarely need to tinker with it and it works all the way across the street.
Happens all the time. Lots of people in apartments/houses who don't want an RJ45 running through the hallway nor do they want to pay for RJ45 wiring (particularly when renting).
Interestingly, in the last 10 years I've not met anyone who bothers with RJ45 in the home. Everyone in the UK gets a free wifi router with their broadband, and tends to just use that.
Not saying you are wrong, just different areas are different.
My parents live in a 200-year-old house with Cat6 ethernet in the walls. They needed to replace the electrical wiring, and decided to get it installed at the same time. I don't think they've regretted it, especially as thick walls attenuate the wifi signal. I'm pretty sure this is unusual, though.
More broadly, I think home desktops are getting rarer in the UK. Wifi is the obvious answer for portable devices with wifi capabilities built in, even in dense housing with lots of devices interfering with each other.
That definitely is unusual, but when you're doing a whole-house electric refurb (given what little you mention, it sounds like a knob-and-tube switchout, right?), you likely have everything torn up to hell and back, so you might as well fix or add anything else behind the walls while you can.
People do. One of my friends asked for a "USB to USB" cable last week. (You have how many phones but no USB cables?) It turns out he wanted to connect the USB type A port on a WD My Cloud (NAS?) directly to his desktop, because connecting the drive via ethernet to the router and transferring over wifi from his basement desktop gave an estimate of two weeks.
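The two-week estimate isn't crazy. With assumed numbers (a 2 TB drive and an effective ~2 MB/s wifi link to a basement; both figures are guesses, not from the anecdote) the arithmetic lands in that ballpark:

```python
# How a multi-TB copy over weak wifi turns into weeks.
# Both numbers are assumptions for illustration.
size_bytes = 2 * 10**12   # assumed 2 TB drive
throughput = 2 * 10**6    # assumed effective 2 MB/s through floors/walls

seconds = size_bytes / throughput
print(f"{seconds / 86400:.1f} days")  # 11.6 days
```

Halve the effective throughput (not unusual at the edge of range) and you're past three weeks, which is why a direct cable wins here.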
I would say the vast majority of people use wifi for desktop. (Not myself personally except on the third floor of my house). You might be surprised by the percentages.
Also, don't underestimate the power of stealing your neighbor's internet. I have a fun little router named "dontstealmyinternet". I kept the router's default passwords but have it blocked; it gets about 5 attempts a month from new machines.
My < 6 month old, $4500 USD desktop has 2 x 1 GbE plus 10/100 BMC, but I put an Intel 7265 PCIe card in it and use that instead.
I do use one of the Ethernet ports but it just goes to another router next to my desk (connected via crappy Powerline to the rest of my network) for testing w/ KVM.
Consumer machines have WiFi pretty standard now, my desktop has WiFi. It came that way from the manufacturer. Not everyone wants to rewire their house to where they want their computer.
I agree, but this is more of a documentation issue than anything. I stopped bothering to look at the Ubuntu "supported hardware" page a few years ago because I could tell from the graphics card listed (and not listed) that it hadn't been updated since about 2011.
I appreciate that hardware testing is complex and expensive, but I'd love to see an annual "high spec" and "low spec" Ubuntu reference build, with a price tag of maybe $1500 and $600 respectively, that have been tested and confirmed working with the current LTS.
That having been said, I wouldn't pay system76 a premium for it. I'd do what I've done every year so far, which is search a bit and then ultimately buy what I want and cross my fingers.
Someone could do the legwork of selecting one such system each year, put it up somewhere and collect referral commissions. Much like voter guides help those interested to gain political clout just by doing their own research and publishing it.
That wouldn't represent a support commitment from Canonical. I'm hoping for less of "this worked last time I tried it" and more "We've found this hardware fairly easy to support, and we're willing to commit to making sure that this specific combo works flawlessly in all cases, and will have functional upgrade paths".
Thinkpads are great but if I buy a Thinkpad T-series laptop I'm forced to buy Windows with it.
Also it's not supported by the manufacturer. Thinkpads have great support from the Linux community but Lenovo doesn't officially guarantee its use (I would be happy to hear if I'm wrong on that).
Thinkpads are great if your standards for screen quality are incredibly low. I can't speak for the ones released in the last couple years but seemingly all of the older ones have garbage tier screens. My T430 has a screen that is at best about as good as the screen that came with my 2009 Asus netbook.
My T460s definitely has a garbage tier screen. And garbage tier trackpad. The trackpoint feels like garbage too. And I had to replace the keyboard once because it was garbage and broke.
T450 owner here, running Ubuntu 16.04 LTS Desktop. My machine hangs when I unplug it from the Thinkpad dock. X rendering glitches are an hourly occurrence and X crashes weekly. In laptop mode, the trackpad handling is not up to snuff either. (I guess if I wanted all those features, I could just install Windows.)
You will most definitely want a recent kernel (16.04 ships with 4.4 - you want 4.8 or later). Also check the Arch wiki about Intel video to see if you can fix your problems with X (https://wiki.archlinux.org/index.php/Intel_graphics), eg. switching from SNA to UXA.
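For reference, the SNA-to-UXA switch suggested here is a small xorg.conf.d snippet (the file name is arbitrary; the section layout follows standard X.Org configuration — double-check against the linked Arch wiki page):

```
# /etc/X11/xorg.conf.d/20-intel.conf
Section "Device"
    Identifier "Intel Graphics"
    Driver     "intel"
    Option     "AccelMethod" "uxa"    # the intel driver's default is "sna"
EndSection
```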
Thanks for the link -- I'm still hanging when unplugging from the dock, but switching to UXA and disabling 3D accel seems to have quieted the glitches down.
I did try some newer kernels a few months ago, but then my wifi stopped working. But this should not be necessary when running a flagship LTS desktop Linux on extremely common hardware from two years ago! I expected more.
Linux hardware support often takes years to mature; for example, the graphics hardware on my Haswell laptop has seen steady improvements despite being years old. I would strongly recommend against using an LTS Linux release from around the time your computer was released. It's just going to be too old. Try Ubuntu 17.04, or take the jump and just use Arch. I use Arch on all my work and home machines without issue; maybe twice a year an upgrade needs 5-10 minutes of extra work to make sure a package update goes smoothly.
> I would strongly recommend against using an LTS Linux release from around the time your computer was released. It's just going to be too old. Try Ubuntu 17.04
No point in doing that. Just use the hardware enablement stack. It gets you the 17.04 kernel/X/etc over the LTS base. Best of both worlds.
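For anyone wanting to try this, the HWE stack on 16.04 is a metapackage install, something like the following (package names as documented for the 16.04 series; verify against Ubuntu's HWE documentation for your point release):

```
# Pull in the newer kernel and X stack on top of the 16.04 LTS base
sudo apt-get install --install-recommends \
    linux-generic-hwe-16.04 xserver-xorg-hwe-16.04
```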
For what it's worth, I tried a few different OSes, including Ubuntu 16.10 (which was the latest at the time), before settling on 16.04 LTS (which was released a year after the laptop came out). I plan to try 17.04 when I get some time, but I also expect to be disappointed. I got about 10min into an Arch install before laughing myself into a coma -- I've seen smoother Unix installs from the 1990s.
But I will reiterate: this is clown shoes. Expecting this kind of effort from desktop users is hostile.
I'll agree that the Arch installer could be better, but it's not difficult, especially once you've done it a few times. The install starts pretty barebones and then you add what you need: annoying if you want a one-click, fully-set-up GNOME (or whatever) desktop, but perfect for people like me. You also get the benefit of learning how all the parts of a working Linux desktop fit together.
Personally I find all-in-one installers annoying. I find I have to spend a ton of time removing crap I don't want and replacing it with what I do want. It would take me probably as much time or more to install and configure Ubuntu as Arch would.
You might want to check out Fedora too -- I hear the latest release is pretty great. Arch based distributions like Antergos or Manjaro might be good to check out if your only hangup with Arch is the arcane installer.
Re: the X issues... dump the Intel X driver for the modesetting driver (you will want a recent kernel for this). Made a world of difference on my Intel laptop.
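A minimal sketch of that switch, assuming the standard xorg.conf.d layout (note that with no "Device" section at all, recent X servers also fall back to modesetting once the intel DDX driver is removed):

```
# /etc/X11/xorg.conf.d/20-modesetting.conf
Section "Device"
    Identifier "Intel Graphics"
    Driver     "modesetting"
EndSection
```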
That's not always the case. My first, cheap ThinkPad worked great, but the second, expensive one (T440p) has bad WiFi driver support (on the current and previous Ubuntu LTS): connections are unstable and throughput is ~0.3x that of the dongle I use (both 2.4 GHz 802.11n). The hardware is a Realtek RTL8192EE PCIe.
Always opt for Intel hardware. They have Linux device driver developers on staff. Sometimes the newest chips are not supported, but they always release something within a few months.
There are many fine Linux laptops of the stays-on-the-desk type, 14"-17". The selection isn't so fine if you're looking for something that will fit in your lap on a plane or train and has 8 GB of RAM, or at least 4.
I just used mine on a Ryanair flight a few days ago, cramped in the middle seat for 3 hours. The XPS 13 fit perfectly on the very small folding table, the top of the screen just sticking under the bottom of the pocket-thing that is always full of useless flyers.
I believe it has the same footprint as a MBA 11. And of course, I'm running Linux on it :)
The X1 Carbon series works great with Linux. It's a little bigger than 11", but still very light and slim, with a nice display. I use a Gen 1 (2012) with 8 GB of RAM. I originally installed Ubuntu 12.04, then upgraded to 14.04 and 16.04; all worked great. I presently run Arch on it, which was also easy to install and works great.
You just trampled over "doesn't sacrifice performance for thinness". And using it on my lap in a plane, like maybe I would when lying on my couch, was never even a consideration. Are you sure that's a practical measure to judge a product by, and not just an Apple-tailored one?
No I didn't, I'm saying that if a laptop doesn't have an acceptable form factor, its performance (or even existence) doesn't matter. I'm also saying that models in the T series do not offer form factors suitable for everyone, contrary to the GP's claim, and offering the Macbook Air 11" as an example of the kind of form factor not offered.
My wife has an X240, something in that series is equivalent to the Air. To be honest I would just go for the Apple, we have had quite a few problems with the Lenovo.
Ack. That's roughly where the CPU is, I wonder if some of the solder balls are cracking and it needs a reflow. I've never had to deal with Lenovo service, but it might be worth contacting them -- that is definitely a hardware issue.
Bumblebee has kind of fallen into a 'not officially supported' state as of xenial, but it will work if you are willing to spend a bit of time cajoling it by messing with drivers and blacklists and config files.
Nvidia also has an official solution now called 'nvidia-prime,' but it's awful. You have to log out and back in to change which card you're using, so you can't just spin up the discrete card for one or two taxing programs in your workflow.
But it can work the way you'd expect it to, if that's what you're asking.
I had bumblebee on my Precision 5520, and it works fine. You need to meddle with it a little, but after that everything works. I actually loved it, since my X11 memory space was controlled by Intel, so my CUDA application development couldn't mess up X11 while running — something you'd expect not to happen anyway, but it happens all the time.
Bumblebee and optirun could be better, but they're usable right now.
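Typical Bumblebee usage, for anyone unfamiliar (glxgears here is just a stand-in for whatever GPU-heavy program you want to offload):

```
# Run one program on the discrete NVIDIA card; everything else stays on Intel
optirun glxgears -info

# primusrun is the lower-overhead bridge, if the primus package is installed
primusrun glxgears
```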
Does it allow connecting external displays? Last time I checked, bumblebee worked almost fine, but it didn't notice when I connected an external display to a DisplayPort hardwired to the dGPU, and the dGPU stayed powered off. I went through a few workarounds and even managed to get something displayed (incorrectly), but nothing really worked like it should, so I gave up. Now I'm using this ugly nvidia-prime thing and just remember to set it to "performance mode" before using my laptop with external displays.
Nah, they're really good I find, maybe not as good as the apple glass trackpad, but certainly not bad. Plus, they have the awesome physical buttons along the top, combined with the trackpoint.
* Does not properly suspend (i.e. wakes up immediately when suspending, shuts down instead of suspending)
* Does not properly resume (i.e. kernel crash on resume)
* Sometimes does not properly resume (even more annoying to debug)
* Resumes randomly, when you don't want it to, often turning your backpack into a forge.
- Hibernate mode doesn't work (at all, your hardware has been blacklisted).
- Plugging in an external monitor occasionally causes everything to crash (but sometimes just compiz).
These are the most annoying problems I have on my Linux laptop. Admittedly, mine is not Thinkpad, but looking at reviews on the latest Thinkpad, at least the battery life issue seems to be ever present. These are pretty much the same problems I've had for the 10 or so years I've been running Linux on laptops. I would have thought they'd been fixed by now. 10 years ago, Windows had a bunch of these problems too, so it was excusable. Now, it's just embarrassing.
I still run Linux on my laptop because I like the dev environment and tools so very very much, but I would pay serious money for hardware that was guaranteed to just work (tm) with Linux, with all of the above solved by the vendor rather than by me. I used to enjoy these little problems, but now they just annoy.
The sleep mode problems are the most annoying to me, the most elusive to solve, and the most impossible to predict from reviews :/
Wifi works perfectly, suspend/resume, docking/undocking too.
As for battery life, the machine drew around 19 W when I first switched to Linux from FreeBSD. After installing TLP and powertop, it's stable around 10.8-12 W with WiFi enabled.
Maybe you might want to try a recent distro, I'm using Fedora and I really like it.
Even my 3G usb dongle worked flawlessly with zero config.
PS: I remember having flaky wifi under Debian 8, but that was due to an old version of the wifi driver. It has long since been fixed in every distro I've tried, including Debian.
PS 2: My laptop is pretty old (x201), so your mileage may vary. You might want to check out thinkwikis for further info.
Partly, that's because the x201 is so old. It's had about 7 years to mature support.
I had an x201 new, and I ran into all the problems listed above for the first two years. Hell, I had to use a USB WiFi dongle for the first year or so because the drivers hadn't stabilized.
Yeah, I'm so tired of hearing this come up when people are looking for Linux laptops. It's very old, and VERY ugly. Most of us want something modern that runs Linux well.
I can only speak about myself of course but running Ubuntu 16.04 on a thinkpad x250 I have absolutely none of the issues you have listed above. Maybe you hear more about people having bad experiences than good ones?
I remember having these issues on an x60 series maybe 8 years ago, but my friends who have thinkpads are all running them fine, even on the jankiest distros with a bit of careful driver picking
Owner of a Thinkpad X1 3rd gen running Debian (started with Jessie, now Stretch), and my experience is quite different but there are some things to know. Let's trade anecdotes:
> - Wifi is flaky (ier than on windows)
No problem there, it's always been rock solid. The chipset is likely to matter; my laptop uses an Intel chipset. Performance-wise Intel may not be the best, but the Linux support has always been good in my experience.
> - Battery life is shit (ier than on windows)
A very common misunderstanding, and very easy to solve. The thing is, a stock Linux distro is made independently of the PC hardware that will run it. There's no integration like any PC vendor does when installing Windows, making sure the Windows configuration is well tuned. In order to be functional on most devices, a Linux distro is typically conservative, and will stay away from enabling low-power modes that are flaky on some crappy PC models.
But for most tier 1 PC brands, the hardware is fine and it's perfectly safe to enable aggressive low-power modes. So just install a package like TLP, or the older laptop-mode-tools, and you're good to go. You can even tune the configuration; it's simple and well commented. For example, with a fast SSD (no spin up/down), one can be very aggressive about putting the drive into low-power states.
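As an illustration, more aggressive drive and wifi power saving is just a few lines in TLP's config file (`/etc/tlp.conf`, or `/etc/default/tlp` on older versions). The option names below come from TLP's documentation, but exact names and accepted values vary by TLP version, so check `tlp-stat` and the comments in your own config:

```shell
# Illustrative on-battery overrides -- verify against your TLP version's docs
SATA_LINKPWR_ON_BAT=min_power     # aggressive SATA link power management
DISK_APM_LEVEL_ON_BAT="128 128"   # allow drive power saving; safe for SSDs
WIFI_PWR_ON_BAT=on                # enable wifi power saving on battery
```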
With this done, taking about 10 minutes tops, I get longer battery life on Linux than on the stock Windows 8.1. And this is as reported by the firmware through ACPI, so it's the same estimator on both sides.
> [Various sleep mode issues]
There was a very nasty bug in Linux's MMU setup that was fixed in 4.8. Before that, it could trigger random and sometimes hard-to-reproduce bugs on some models, leading to crashes on resume. I was affected, and it was a pain. The bug had apparently been there for a long time.
Since 4.8, it's been rock solid. Zero issues. And it's really night and day in terms of user experience. In case some of your issues were related, you may want to make sure you're running a recent enough kernel.
As for the unwanted wake-ups in a bag turned into an oven? That's only ever happened to me on my work TP running Win7. From experience, sleep is not perfect there either.
No experience on using an external monitor with my Linux laptop.
One of the main weaknesses is that there's no ODM integration if you install Linux yourself. With big brands like TP, it's still mostly been smooth in my experience, except for the nasty resume bug fixed in 4.8. If that's a problem for you, there are now vendors with pre-installed Linux. Then it's a similar situation to Windows.
> The thing is, a stock Linux distro is made independently of the PC hardware that will run it. There's no integration like any PC vendor does when installing Windows, making sure the Windows configuration is well tuned.
You make it sound like Windows needs to be fine-tuned (by the vendor) to provide good battery life. This is absolutely not the case. You install a bare Windows 10 on a random laptop, and battery performance will likely be much better than on Linux.
Anecdata, but my desktop Lenovo workstation's suspend function worked well with Linux, but after an update (few months ago) it never resumes successfully. Nothing in logs -- just simply doesn't wake up properly. (4.10 kernel.) These are painful things.
> Anecdata, but my desktop Lenovo workstation's suspend function worked well with Linux, but after an update (few months ago) it never resumes successfully. Nothing in logs -- just simply doesn't wake up properly. (4.10 kernel.) These are painful things.
That sounds like my experience with Windows 10 on my gaming PC. I only use that machine when gaming, and while it has a <10 second cold boot time (god I love NVMe), I prefer to leave it running and let it fall asleep after a few minutes of inactivity. Some time last week or so, I noticed it never cycles fully to sleep; it will fall asleep and almost immediately wake up. I'm positive this was due to a Windows update, as I haven't changed any settings on it before or after the incident first occurred.
Now, this is on a PC I built, but I used a common motherboard (Gigabyte Z170M) and never had this issue on my previous build, also based on a Gigabyte Z series board. My wife's computer is a mini-PC made by HP, and it started having the same sleep/wake issues during the same week. Something in a recent Windows update has affected sleep states.
I had similar, terrible issues with my gaming rig when I let Windows auto-update from 7 to 10. I found that there is an option in system update to "restore" or "auto-fix" the OS. You might start by trying that.
I found that I needed to let the entire thing be wiped (including all software) and re-installed in order to get it working. A long time and complete pain in the ass, but it's much better now.
Just a warning if you go that route: MS decided that my legit MS Office keys were "pirated" because they were old and wanted me to upgrade (after telling me it was a valid key 3 hours before). So I told them to pound sand and that I was going to buy Macs from now on, and I'm not a fan of Apple at all. They offered me nothing but the chance to give them more money.
> Just a warning if you go that route: MS decided that my legit MS Office keys were "pirated" because they were old and wanted me to upgrade (after telling me it was a valid key 3 hours before)
This happened to me after my first upgrade to Windows 10. I had a legit copy of Office 2010, and when I upgraded my Windows 7 installation to Windows 10 during the free year, I opted to do a clean OS install after 10 was activated. Upon reinstalling Office and inputting my key, at first it activated then it threw my Office install into an unactivated state and told me to contact my administrator. Umm, what? I'm the administrator and this was a retail purchased and licensed copy that worked fine before being installed on Windows 10. I even tried reverting to Windows 7 and installing Office on that, but it never activated and gave me the same message.
Thankfully I don't really rely on Office anymore and can get by with F/OSS alternatives or Office Online, but it definitely sucks that Microsoft appears to try pushing its business customers into O365 subscriptions and away from traditionally licensed software using what I feel should be illegal tactics.
Completely agree with you. I have in writing that my key would be good, even though the MS site said there was an error and I needed to check with Customer Service. I then explained this to three people, who found the written statement and said they couldn't, or refused to, fix the problem. They just wanted to sell me an O365 subscription.
As to how legal their tactics are, I'm not sure. I do know that before this I would never have considered anything other than MS; now I'm left moving on to Macs because I can't give them more money, and the work I do tends now to run in Linux or alternative OS environments. Lots of industrial software that is touchy enough as it is...
I only boot into Windows once a month or so. This Monday, when I had my laptop sitting idle for a bit, I noticed the sound of the hard drive settling down and spinning up again in a regular pattern. At the time, I thought it was the Antivirus deciding to do an idle scan just when the OS put the drive to sleep, but now I think I might be affected by the same bug you mention.
I have been on the X2XX series for over 10 years, counting 5 laptops.
EVERY single thing has worked with Ubuntu (every six-month release; since 12.04 I've only used LTS releases) and required little to no effort.
As for a "Desktop" I haven't touched one at work since 2004. TBH I believe only gamers care. And gamers like to build/adapt their own hardware. Unless you can differentiate heavily and have something unique (something like building a RED camera or a super fast Electrical car) how is that going to fly in a marked that is in decline?
Stick to building a super-high-end laptop that can rival a Lenovo X series model and we will look into that.
They may work "perfectly fine", but they rarely work perfectly. There always seems to be some thing that doesn't quite work (function keys, HDMI output, card reader, graphics card switching etc.).
I haven't enjoyed building computers since I was a teenager. I'd much rather pay for someone else to do it. I pay for several people to come around my house and do things I know how to do but would rather spend my time elsewhere. This is no different.
Exactly this. If you're a typical Linux user, you've probably already built computers from scratch. An MBP/MBA-class laptop aimed at Linux is where the need actually lies.
The Wii U has a lot of great games you can play right now, and it will take the Switch a few years to reach a similar point in its life cycle. Also, it's the best console to play Wii games on right now (and possibly ever), and there are plenty of games from Nintendo's older consoles available on the eShop. Considering that in a few months it will be nearly impossible to buy a new Wii U, I'd say it's the perfect moment to buy one.
Prices for used Wii Us are around 200 EUR in my region. I'd say that's not a bargain and makes the idea of a new Switch for an extra 100 bucks way more attractive.
The snark is unnecessary. Of course they exist. Whether they are "dirt cheap" and whether the associated trade-off in quality expectations is worthwhile are entirely legitimate debates.
For example, I saw refurbished Wii U consoles for $200 the other day, and I'd be hesitant to spend $100 or $150 on a device with no warranty. But for $200 it's not crazy to think you might as well go for the Switch at $100 more.
Just switch to Firefox. This kind of stuff is why we need plurality in the browser space, and Mozilla is one of the few companies that takes - or has to take - a pro-user stance on many issues.
I'm curious about your train of thought: how are Firefox and Mozilla pro-user? And let's not take their PR into account, just the facts of what they've actually done so differently from Google to make them noticeably pro-user in your opinion.
It's easy to disable the little bit of surveillance they do to make money, whereas Google keeps surveilling people in more ways over time across its various platforms. The difference probably comes from whether users or advertisers are the primary focus.
What made you think Firefox and Mozilla are so anti-user? I didn't get many claims along those lines back when I was evaluating them.
Maybe it's just me being silly, over-critical and nitpicking minor things, but... various design decisions didn't feel right to me.
A few examples that come to mind:
- Sync is just terrible. It's an insecure, awfully overengineered, poorly documented proprietary mess.
- Mandatory addon signing was understandable, but still didn't feel exactly right. Probably because I'm a luddite who doesn't fancy walled-garden app stores, and this somehow resembled them.
- Moving to WebExtensions-only is going to hurt badly. AFAIK it was announced they'll soon stop signing new non-WE addons, starting with Firefox 53 (which is quite soon). I don't want a Chromium clone with another rendering engine and a Firefox Account instead of a Google one.
- DRM support. Browser market share, user requests, etc, but still - thanks for helping that cancer spread more freely.
- Test Pilot, instead of just publishing experimental addons on AMO, feels weird. Especially the fact that those addons self-uninstall after someone says the experiment's over. Well, it's Mozilla's work and it's their decision how they want things to be, but it just doesn't feel right to me. FLOSS used to be somehow... different back in the day.
- Pocket integration was sort of controversial. I've used Pocket's extension, but it surely didn't belong in the browser core.
- Some UIs were dumbed down to the extent of being barely usable. Some comments here blame Chrome for hiding TLS info, but Mozilla pioneered that (although to a lesser extent).
I'm about to self-host Sync myself, so I'm interested in your claim about it being insecure. I won't sync passwords or form fields, because I don't store them in the browser. Only browsing history and maybe tabs, though I certainly don't want to send all my desktop tabs to my phone; most of them won't make any sense there.
In short, the crypto itself looks okay (I'm not a cryptographer!), but the auth form you see is served over the network. It doesn't send the password back (it just passes it to the browser runtime, which runs KDFs on it), but you won't know what you'll be served next time.
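To illustrate the part that is sound: the password never leaves the client unstretched; the browser first runs a PBKDF2 "quick stretch" over it. A rough sketch, where the salt format and iteration count follow my reading of the onepw protocol docs and may not match the current parameters exactly:

```python
import hashlib

def quick_stretch(email: str, password: str) -> bytes:
    """Client-side key stretching: the server only ever sees values derived
    from this, never the raw password. Parameters here are illustrative."""
    salt = b"identity.mozilla.com/picl/v1/quickStretch:" + email.encode()
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 1000)

stretched = quick_stretch("user@example.com", "hunter2")
print(len(stretched))  # 32-byte derived key
```

The weakness isn't this math; it's that the JavaScript performing it is re-fetched from the server on every login, so a compromised or coerced server could serve a version that leaks the raw password.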
As for the protocol: there is WebDAV. Seriously. It's functionally equivalent to what their blob storage does, except simpler, vendor independent, and it doesn't mandate any particular auth schemes. Oh, and their auth protocols are a total mess (BrowserID, HAWK _and_ OAuth: three different protocols are necessary just to talk to the damn system!). I get it, three teams were working on different pieces (accounts, tokenserver and the actual sync blob storage), but they could've at least tried not to invent their own, and used something standard. Or, at the very least, settled on a single protocol.
I'm saying this as someone who has spent some time reading docs and reverse engineering the rest, and who has implemented an almost-working (sans some undocumented oddities and a few lazy omissions leading to glitches, but it mostly works for me) standalone Sync 1.5 server: the same functionality could've been done in a much saner and simpler way.
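To make the WebDAV comparison concrete: the storage layer reduces to authenticated PUT/GET/DELETE of opaque, client-encrypted blobs keyed by collection and record id, which is exactly the interface WebDAV already standardizes. A toy in-memory sketch of that interface (all names hypothetical, not the real server API):

```python
class BlobStore:
    """Minimal blob store mirroring WebDAV-style PUT/GET/DELETE semantics.
    A real server would add auth and persistence; the payloads stay opaque
    to it because encryption happens client-side."""

    def __init__(self):
        self._blobs = {}

    def put(self, collection: str, item_id: str, blob: bytes) -> None:
        self._blobs[(collection, item_id)] = blob

    def get(self, collection: str, item_id: str) -> bytes:
        return self._blobs[(collection, item_id)]

    def delete(self, collection: str, item_id: str) -> None:
        del self._blobs[(collection, item_id)]

store = BlobStore()
store.put("history", "abc123", b"<ciphertext>")
print(store.get("history", "abc123"))
```

Everything else in the sync protocol (accounts, tokenserver, the three auth schemes) is machinery around this very small core.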