
Wikipedia:Reference desk/Computing

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 24.255.17.182 (talk) at 21:06, 15 January 2017 (Modular arithmetic with limited integer range: new section). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.


Welcome to the computing section
of the Wikipedia reference desk.
Select a section:
Want a faster answer?

Main page: Help searching Wikipedia

   

How can I get my question answered?

  • Select the section of the desk that best fits the general topic of your question (see the navigation column to the right).
  • Post your question to only one section, providing a short header that gives the topic of your question.
  • Type '~~~~' (that is, four tilde characters) at the end – this signs and dates your contribution so we know who wrote what and when.
  • Don't post personal contact information – it will be removed. Any answers will be provided here.
  • Please be as specific as possible, and include all relevant context – the usefulness of answers may depend on the context.
  • Note:
    • We don't answer (and may remove) questions that require medical diagnosis or legal advice.
    • We don't answer requests for opinions, predictions or debate.
    • We don't do your homework for you, though we'll help you past the stuck point.
    • We don't conduct original research or provide a free source of ideas, but we'll help you find information you need.



How do I answer a question?

Main page: Wikipedia:Reference desk/Guidelines

  • The best answers address the question directly, and back up facts with wikilinks and links to sources. Do not edit others' comments and do not give any medical or legal advice.


January 10

Memory use by R (programming language)

Is R's memory management less efficient than other languages like Java or Python? The R (programming language) article could use some expansion around this issue. --Hofhof (talk) 12:30, 10 January 2017 (UTC)[reply]

This is a very difficult topic. R is primarily used for complex mathematical calculations. Most people work with vectors and add/remove objects in the vectors many times. R uses malloc to reserve memory. Doing so for each item added to a vector would make R deathly slow. So, R reserves large chunks of memory and then internally manages the memory in the block. To over-simplify it, R requests a big chunk of memory from the operating system, even if it doesn't really need all of it. Then, R reserves/releases memory inside the block itself. The entire block is always reserved by R from the OS perspective. So, you can say that the memory management is better suited for what R does since it is handled by R. You could also claim that the memory management is wasteful since R reserves more memory than it will use. In my opinion, it is not proper to compare memory management in R to memory management in Java or Python because you are comparing two radically different things without even getting into the specifics of exactly how R's memory management works. 209.149.113.5 (talk) 18:28, 10 January 2017 (UTC)[reply]

Websites for asking questions from certain nationalities

For example I'm trying to identify a music video that's probably German or Austrian and I thought maybe I would have a higher chance of people identifying it for me on a website frequented by German or Austrian people. I've found www.gutefrage.net but I'm looking for more.

I'm also looking for a website where I can reach Czech people to identify a probably-Czech movie. I've found http://www.csfd.cz/ but it's one of those sites where you have to have a certain number of points before you can post, so I can't use it.

Languagesare (talk) 14:35, 10 January 2017 (UTC)[reply]

  • [1] seems to be the de WP equivalent of our reference desk. I cannot read Czech but you can try to find a similar page.
If you meant that you want to ask in English I am out of ideas. (Sweden has this but that is not exactly what you had in mind.) TigraanClick here to contact me 14:57, 10 January 2017 (UTC)[reply]
You could ask, in English, there. Just as we sometimes get foreign language Q's here, I'm sure somebody will be able to translate there. StuRat (talk) 17:27, 10 January 2017 (UTC)[reply]
Thanks, I did that. Languagesare (talk) 19:19, 10 January 2017 (UTC)[reply]

Scroll speed in Chrome vs IE

I'm visiting my father and he's asked me to look at something on his laptop. It's an Asus X540S running Win10.

The issue that he's asked me to look at is that in IE, his two finger scroll on the trackpad will move the page at an acceptable rate. In Chrome however, it is very quick! He, at first, was telling me that it just snaps to the bottom of the page, though it doesn't. It's just so quick that his older eyes couldn't quite catch it. It's even so fast that I would find it annoyingly quick and I have much more dexterity in my fingers than he does.

I think that slowing it down system-wide would be fine but I can't find a way to do even that. Is there a way to slow it down system-wide or more specifically in Chrome? 50.244.0.245 (talk) 17:32, 10 January 2017 (UTC)[reply]

I suggest looking into the Chrome addon "Chromium Wheel Smooth Scroller" or one of the many variants. It adds a lot of scrolling settings - which includes how many lines to scroll and how fast to do it. 209.149.113.5 (talk) 18:21, 10 January 2017 (UTC)[reply]
I've noticed on Wikipedia that some pages with a large amount of information take a long time in IE before they will do any scrolling, whereas Chrome doesn't have that problem. Bubba73 You talkin' to me? 18:23, 10 January 2017 (UTC)[reply]
The Chromium extension doesn't do anything for the trackpad. I'm not seeing other options for the trackpad... Other ideas? 50.244.0.245 (talk) 20:55, 10 January 2017 (UTC)[reply]

double the number of CPUs by an option

This video says that CPU chips have twice the number of advertised cores for spares if some of them quit working. Is this true? Is it safe to do this? What happens if one of the cores dies after you do this? Does this increase the size of the memory cache too or leave it the same? Bubba73 You talkin' to me? 18:21, 10 January 2017 (UTC)[reply]

I have not watched the video. Your description is not accurate. CPUs do not have spare cores lying dormant, waiting for an active core to fail. If you don't know much about computers, you might think that you have double the cores in a CPU when using various tools to inspect it. As an example, I have a rather old computer. It has two quad-core CPUs. That means that it has 4 cores per CPU and 2 CPUs for a total of 8 cores. If I look at the CPUs through the operating system, I see 16 cores, not 8. That implies that there are 8 cores on each CPU, not 4. Of course, there are only 4 cores. The CPUs have hyperthreading enabled: each physical core shows up as two logical cores to the operating system. I can turn off hyperthreading and see only 8 cores if I want (I don't want to). In the end, you simply need to know what you are doing and what you are looking at. 209.149.113.5 (talk) 18:36, 10 January 2017 (UTC)[reply]
The video description begins: "NOTE - This video is a joke..." Dragons flight (talk) 18:38, 10 January 2017 (UTC)[reply]
Whoops, I didn't see that warning. Bubba73 You talkin' to me? 19:00, 10 January 2017 (UTC) [reply]
Resolved
The important part of user:209.149.113.5's reply is that the 8 physical cores each support hyperthreading, and (to some extent) behave as 16 logical cores. LongHairedFop (talk) 20:07, 10 January 2017 (UTC)[reply]
Yes, I know about that. Three of my current nine desktops have hyperthreading. Bubba73 You talkin' to me? 21:39, 10 January 2017 (UTC)[reply]
But note that some mainframes (e.g. IBM zSeries) do indeed often have more CPUs than sold and available to the customer. They can be activated remotely if the customer buys an upgrade. Apparently, it is cheaper to always ship the hardware than to build different versions and occasionally go through a real hardware upgrade. --Stephan Schulz (talk) 22:34, 11 January 2017 (UTC)[reply]
Some CPUs allow you to enable/disable cores, with proper firmware (e.g., BIOS/UEFI) support. Also, CPUs can be sold with things that are physically present but faulty or disabled by the manufacturer. CPUs can be sold with a locked CPU multiplier. Chips that have a fault affecting only part of the chip may be sold with the faulty portion disabled; for instance, if a defect affects only one core, it can be sold as a CPU with fewer cores, rather than thrown away. As mentioned above, this particular video may be a joke, but there's some basis in reality for similar things. --47.138.163.230 (talk) 04:06, 12 January 2017 (UTC)[reply]
Also another thing that came to mind later: fancy, expensive "enterprise" hardware can support all kinds of things, like multiple physical CPUs and hotswapping of CPUs, not found in your average consumer device. Systems where high availability and reliability are required, such as banking and finance, industrial equipment, control systems on vehicles, medical equipment, etc. frequently feature multiply redundant components, including CPUs. In such systems you can often have backup CPUs which are normally idle, there to take over if a primary CPU fails. --47.138.163.230 (talk) 11:30, 12 January 2017 (UTC)[reply]

Folder Locking Software

I require folder-locking software that opens a security window to enter a password whenever I click on the ‘folder’, something like the ‘BitLocker’ drive-locking software. Drive locking seems to be impossible (daunting, risky, tried and tired). It would also be good if the same software could lock files, images, video clips, movies, music files, and so on. Is there anything you guys have used, and are happy with, that I could use too? 103.230.105.26 (talk) 18:46, 10 January 2017 (UTC)[reply]

Here is a list of best encryption software [2]. If you happen to use OS X, there is built-in encryption capability [3]. SemanticMantis (talk) 20:20, 10 January 2017 (UTC)[reply]
Microsoft Office also has a built-in option to password-protect files. - CHAMPION (talk) (contributions) (logs) 03:42, 11 January 2017 (UTC)[reply]
Also, you can password-protect a zip file. - CHAMPION (talk) (contributions) (logs) 03:52, 11 January 2017 (UTC)[reply]

You should take care if using zip. The initial standard encryption is broken and has been for a long time; see Zip (file format)#Encryption. Most recent tools will probably either not support it for writing or will warn you, but it's possible some don't. Newer, better methods are supported, although as our article mentions it can be a bit messy which tool does what. 7-Zip, mentioned in the link above, at least (AFAIK) uses fairly standard encryption. You also need to consider whether the encryption is able to encrypt the headers, including filenames and checksums (hopefully checksums; if the format stores standard, unencrypted hashes, people don't have to decrypt your archive to work out what's stored in it).

However, the biggest concern with encrypted archives, but also to some extent with directory- or partition-style software, is that you need to take far greater care not to screw up. It's pointless, for example, to encrypt an archive of the sensitive stuff if you earlier stored it unencrypted on the same disk and didn't wipe that copy after making the encrypted file. Or if you later decrypt the content, perhaps to view it, and store it on the same disk. This is less likely with directory- or partition-style encryption systems, since they generally allow the files to be used by the OS without needing to be decrypted separately, and likewise to be stored directly in encrypted form.

I'm assuming content being recovered from the drive(s) of the computer isn't a concern, since this has never been mentioned before; only recovery from the removable drive is. But even so there are still risks, which is why whole-disk encryption is far better, especially if you don't know what you're doing and don't understand the tool. In most cases, of course, you do have to worry about content being recovered from the disks of the computer, since if the removable drive can be stolen, so can the computer, especially if it's a laptop. In that case you need to encrypt not just the removable drive but the drives of the computer. And even then you should be aware of the limitations, e.g. if someone steals the computer while the decryption key is in memory because it's on and unlocked. Malware. Etc.

Nil Einne (talk) 11:39, 11 January 2017 (UTC)[reply]

January 11

DeepDream and noise

If DeepDream is left trained on the same image for an excessively large number of iterations, does a kind of random noise result? What does that look like? Can you find me an example?  Card Zero  (talk) 18:10, 11 January 2017 (UTC)[reply]

I remember reading, perhaps a year or so ago, that in these circumstances it may produce a vaguely face-like image (though with over-large "eyes"), regardless of the starting point. However, I have absolutely no recollection of when and where (internet, print magazine etc) I came across this. Perhaps my less-than helpful answer will prompt someone else's memory. {The poster formerly known as 87.81.230.195} 2.122.62.241 (talk) 08:37, 14 January 2017 (UTC)[reply]
Lots of things can look "vaguely face-like" - such as the Face on Mars. The phenomenon is known as pareidolia. Mitch Ames (talk) 12:17, 14 January 2017 (UTC)[reply]
In fact, Pareidolia#Computer vision actually refers to "... the DeepDream software, which falsely detects and then exaggerates features such as eyes and faces in any image." Mitch Ames (talk) 12:20, 14 January 2017 (UTC)[reply]
Watching the longest youtube videos (in terms of number of iterations) that I could find, two of them did resolve to a kind of soup of human-like faces dominated by large eyes (which had a lot of blue-green around them, as if belonging to an amalgamation of animals). So I think you're right, this must be how it typically plateaus to a local maximum. Thank you.  Card Zero  (talk) 14:55, 14 January 2017 (UTC)[reply]

"Apple Certified Pre-owned" iPhone

I just bought an iPhone 6S on eBay. It was advertised as new but arrived marked "Apple Certified Pre-owned". What exactly does this mean? Will it be in new condition aesthetically, eg. no scratches? Will it have the same life as a new unit? Etc. Amisom (talk) 22:27, 11 January 2017 (UTC)[reply]

Have you tried google? Vespine (talk) 00:30, 12 January 2017 (UTC)[reply]
Yes. I didn't find answers to either of the two questions I asked above. Thanks @Vespine:. Amisom (talk) 08:35, 12 January 2017 (UTC)[reply]
Not sure why it didn't work for you. When I search for 'Apple Certified Pre-owned scratches' or 'Apple Certified Pre-owned scratch' or 'Apple Certified Preowned scratch' or 'Apple Certified Preowned scratches' or 'Apple Preowned scratch' or 'Apple Preowned scratches' i.e. the logical search terms based on your question I find [4]. As for the second question, these sort of things are generally impossible to answer. I mean some companies have return rates for such devices so have a good idea, but they don't tell it to ordinary consumers. Nil Einne (talk) 01:46, 13 January 2017 (UTC)[reply]

January 12

Can any virus set a computer on fire?

I know the email about the Torch virus is fake, but can any virus really set a computer on fire? Include normal computers, laptops, phones and iPads. Don't include special computers for factory work or airplanes. Don't include bad hardware, for example the Samsung Note 7 battery, unless a virus can cause the fire. --Curious Cat On Her Last Life (talk) 09:24, 12 January 2017 (UTC)[reply]

Maybe not the computer, but it was at one point possible to set the printer on fire. Mitch Ames (talk) 09:51, 12 January 2017 (UTC)[reply]
Or a virus could call the Halt and Catch Fire instruction. Mitch Ames (talk) 09:53, 12 January 2017 (UTC)[reply]
Not actually catch fire, but it was possible for malicious software to physically damage some older computers with a killer poke. Mitch Ames (talk) 09:54, 12 January 2017 (UTC)[reply]
If the device has a lithium-ion battery of some sort, which would include most laptops, phones and tablets but not desktop computers, it may be possible to change the firmware of the battery charger to cause overcharging, or to allow over-discharging and then attempt charging. Both of these could potentially cause a fire. See some brief discussion of this possibility here [5]. However, this assumes that there is modifiable firmware which allows it, and that there are no additional unmodifiable safety features to prevent it, or at least to prevent the cells "venting with flames". In terms of charging, one obvious consideration is whether the charger is even capable of outputting sufficient voltage. This documentation for a Sanyo 18650 cell [6] allows up to 4.5V with a faulty charger, suggesting that while frequently charging to 4.5V may cause enough damage to eventually vent with flames, it should not happen after a single instance. Most other components are unlikely to catch fire or cause something else to catch fire. E.g. even if you turn off the fan (if any), the heatsinking etc. means there's a limit to how hot the CPU and GPU are likely to get before they die. And in practice most modern CPUs and GPUs (at least in desktops/laptops) have thermal throttling in addition to a thermal cutoff, which normally stops them killing themselves from overheating alone, and which often (AFAWK) can't be shut off. (If you rely exclusively on the built-in thermal cutoff, it's possible that tripping it several times a day for many days will eventually cause enough damage to break the chip. In addition, you may or may not be able to kill it by overvoltage, especially with a computer designed for overclocking.) I do wonder about the VRMs (as stuff like FurMark etc. has shown), but you're more likely to just kill them than cause a fire. Nil Einne (talk) 16:25, 13 January 2017 (UTC)[reply]
I have no idea if this would work (and after an incident in Dubai several years ago, I believe standards on harmonic distortion have been improved), but I've speculated that an alternate plan is to make a virus that uses heavy processing power for half of each 1/60 of a second, synchronized with an external time source; this "harmonic distortion virus", let loose in office buildings and hotels or even homes with many computers, might shift the ground voltage for the entire building's electrical system to something close to the mains voltage and (putatively) burn down whole countries, especially when integrated with the approach above. Wnt (talk) 17:42, 13 January 2017 (UTC)[reply]
Printers can be hacked to catch fire - Scientific American, 2011. This theoretical idea resembles some very early programmer jokes. Note the entries on this list for ETO, KFP and OBU. I read recently (but where?) that this harks back to early printers - was it thermal line printers? - when the mild smouldering of a jammed and unattended printer was a genuine risk.  Card Zero  (talk) 15:10, 14 January 2017 (UTC)[reply]

January 13

Tesla privacy

I was just reading this, which gives me the impression that every charge for every Tesla is tracked from the company's central server, at least where the identity of the car is concerned. It strikes me that this completes an apparent design, since according to some things I read the vehicle apparently collects detailed logs of everything the driver does for about the past 50 hours. [7][8][9]. There is some inconsistency in those: the article about the New York Times gives a strong impression that it was tracking the car's position, while the programmers say that there is no evidence of it storing actual position data... while also saying the file contains encrypted material only Tesla can read. Anyway, I was wondering if you can recommend more data to put this together reliably (this goes there, saying the company even gets profiles of how the driver acts on specific roads, but it seems a little hard to follow certain details). My overall impression from the preceding is that by controlling the only power generally available and making electronic connections whenever it is powered, the company knows precisely where and how the "owner" of the car is driving soon enough, but I'd prefer some clearer sourcing before I go on about it in the article. Wnt (talk) 17:34, 13 January 2017 (UTC)[reply]

  • For your first link: if Tesla gives out free charging to its customers, I see no way to ensure that only Tesla-issued cars are plugged into it without going online. You obviously need some sort of identification to prevent a smart guy from coming up with an adapter and sucking free electrons. An offline challenge-response authentication is all sorts of trouble (I can give details if needed), and a huuuge security liability, especially if you are going to bill people with it.
For the rest, well... Computer driving being a game of machine learning, Tesla has a legitimate interest in keeping as much driving data as possible. It will actually improve the auto-driving software. It is certainly more legitimate than, say, Google maximizing the profit they get from auctioning your computing power, bandwidth, and human attention to whoever pays the most to execute their arbitrary code in your browser - err, I mean, delivering ads that interest you. If I were in charge, I would make the thing opt-out but with ten steps before reaching the "disable" button, so that I lose data on the 0.1% of civil-rights activists who use it, but also lose the criticism that comes with it. TigraanClick here to contact me 18:39, 13 January 2017 (UTC)[reply]
Actually I was looking for capabilities rather than explanations, since the latter seem pretty simple to come up with post hoc and don't really mean a lot. Wnt (talk) 22:01, 13 January 2017 (UTC)[reply]
Nitpick: on the net, you aren't really taking Tesla's electrons when you charge your car at Tesla's station... you're taking their energy, because for every electron that flows into your battery as electric current, a replacement electron flows back out. In other words, even as you "charge" a battery, you are completing an electric circuit. The battery doesn't end a charge cycle with more electrons than it starts with. Rather, it finishes with the same number of net electrons, chemically emplaced into a higher energy state. The clever user who seeks to steal electrons will find himself carrying an electrostatic charge; the effects of the coulomb force would be macroscopic for any thief who tried to take even a few milligrams of electrons. They'd probably be unable to depart the charging station, even using a powerful high-torque electric motor like the one found on a Tesla. WP:OR - Shall I provide a link to these calculations on Wolfram Alpha? Nimur (talk) 22:37, 13 January 2017 (UTC)[reply]
Nimur, I read the Coulomb force article, but I'm unclear — it looks like the force would either be immensely repulsive (i.e. the thief would get injured so badly he'd be unable to leave) or immensely attractive, but I didn't understand the article well enough to know which of these would be true. Or perhaps something else would happen (e.g. a massive static electricity shock, rendering the thief badly injured); which, if any, of these is what you meant? Nyttend (talk) 01:19, 14 January 2017 (UTC)[reply]
Ultimately, you don't "steal electrons" when you charge a battery - so, by default "none of the above" - but for whichever non-physical hypothetical scenario you could imagine, a different problem would arise! If you tried to store the electrons somewhere, they'd repel each other (with a lot of force); if you tried to take them somewhere, that would require separation of charge... and so on. Each of these physical phenomena would cause trouble for any macroscopic movement of charge. Nimur (talk) 18:37, 15 January 2017 (UTC)[reply]
Nitpicks are always welcome on the RefDesk, but colloquially using the term 'electrons' to stand for electricity and/or electronic data has a long history on the internet. Matt Deres (talk) 14:11, 15 January 2017 (UTC)[reply]
Indeed; the colloquial plain-English usage of the word "charge", as it pertains to the normal procedure for an electric battery, is a troublesome word; because "electric charge" has a very specific meaning in physics. In reality, we often colloquially use and abuse the terminology for electric current, voltage, electrical energy, electrical power, electrical charge... and then, we additionally complicate things with dynamic processes including electromagnetic waves... Point taken, though! In colloquial speech, all of these terms are used much more freely! Nimur (talk) 18:37, 15 January 2017 (UTC)[reply]
You can look at the analysis of logs which were publicly released in this dispute [10] [11] [12]. It doesn't include any position information, but that may just be because it wasn't relevant to the dispute. Also, this was a while ago. Note that although this car was being test-driven, I've never seen any suggestion the logging wouldn't have happened normally, although I believe, unsurprisingly, the journalist was required to allow Tesla to retrieve the logs. Note also that, other than the charging stuff, I don't actually see anything in those logs that isn't probably recorded in most fancy cars, and probably non-fancy ones with an electronic control unit, and obtainable via OBD2. Nil Einne (talk) 04:33, 14 January 2017 (UTC)[reply]
@Nil Einne: I appreciate that all cars were, rather stealthily as I recall, set up with electronic "black boxes" that later started turning up in court proceedings for reckless driving and such. However, those boxes in many cases --- I think --- have remained inaccessible to routine daily monitoring by higher authorities. The question here is whether Tesla does, as that one source suggests, have access to the entire log each time the vehicle is charged, or under some other circumstance, and what that log contains. Wnt (talk) 16:22, 14 January 2017 (UTC)[reply]

Ambiguity and programming languages

Could a programming language be ambiguous in any respect? Can we be 100% sure that a language is not ambiguous, no matter what you express in it? — Preceding unsigned comment added by 31.4.136.155 (talk) 23:03, 13 January 2017 (UTC)[reply]

In many programming languages it is possible to set up a race condition, the outcome of which is not predictable. Jc3s5h (talk) 23:08, 13 January 2017 (UTC)[reply]
Also, many languages explicitly allow for constructs that have undefined behaviour (in C, e.g. comparing two arbitrary pointers, or the order of execution between sequence points, or, IIRC, dereferencing the NULL pointer). Such ambiguities are often trouble for portability, because different implementations may implement them differently. --Stephan Schulz (talk) 23:22, 13 January 2017 (UTC)[reply]
Undefined behavior is a huge issue, and I think it counts as "ambiguous".
I like to think of three kinds of programs: the good, the bad, and the ugly. Good programs are correct, and work. Bad programs have an error which the compiler catches. But ugly programs contain undefined behavior, which the compiler might catch or not, and which if the compiler doesn't catch might do what you want or not, or might do different things depending on the phase of the moon.
In a programming language, you can think of undefined behavior as a "gulf" between the good constructs and the bad constructs. You can imagine that the wider that gulf of undefinedness is, the more risky (or at least tricky) programming in that language is going to be.
I like C a lot -- it's still my favorite programming language -- but I have to admit, its gulf of undefinedness is uncomfortably wide.
(I believe I've heard that a design goal of Java was to eliminate undefined behavior entirely, i.e. to shrink the gulf of undefinedness down to nothing.) —Steve Summit (talk) 13:34, 14 January 2017 (UTC)[reply]

January 14

Intel 80186 manufacturing

Our article on the Intel 80186 says that production of this chip began in 1982 and continued until 2007. Aside from replacements of older hardware (especially for the embedded system for which the 186 was largely used), what, if any, market would there have been for the 186 by this time? I can't imagine a reason to do anything except replace old parts (and even then, why use a quarter-century-old design when you can upgrade to something much newer and better supported; museum/archival needs would be a tiny fringe of the market) with such an old chip design. Nyttend (talk) 01:15, 14 January 2017 (UTC)[reply]

Lots of military hardware used the MQ80186-6/B and had very long production runs. If an embedded system does the job you designed it to do, why redesign it with a "newer and better" chip? Also, few of the more modern chips come in ceramic-and-glass packages. --Guy Macon (talk) 03:47, 14 January 2017 (UTC)[reply]
Which also explains why space-faring devices are powered by chips found in three-generation-old products. Clubjustin Talkosphere 05:49, 14 January 2017 (UTC)[reply]
Displaying my ignorance here — if you put in the newer and better chip, even one with a ceramic and glass package AND one that fits with the connections (the screws, or whatever attach it to the rest of the board, will keep it from falling off), would that require retooling a bunch of the other hardware? I was imagining that the newer and better chip (at least a later generation of an Intel chip in this line), as long as it could be attached to the board securely, would be compatible. Nyttend (talk) 12:23, 14 January 2017 (UTC)[reply]
[ec] Nyttend, there are newer, faster drop-in replacements for some chips. One example is the Dallas/Maxim DS89C420, which drops into a standard 8051 socket and runs the existing 8051 software, but twelve times faster with the original clock crystal and fifty times faster with a crystal change.[13][14]
The 80C186EA is a newer, faster drop-in replacement for the 80C186. [15] --Guy Macon (talk) 16:51, 14 January 2017 (UTC)[reply]
No, that's very much not the case. You could, of course, make a "better" 80186 now - you could probably even get a Raspberry Pi running software to emulate an 80186 in real time. But real newer chips use newer designs, and they are not usually fully compatible. There is usually some backwards compatibility engineered into newer chips, but that does not hold up over 30 years. You need, e.g., different firmware for low-level interaction even if user programs and most of the OS can remain unchanged. You also need different voltages: the 80186 ran on 5V system power. I have a harder time reading the i7 data sheet, but it looks like the maximum supported voltage is 1.6V [16]. --Stephan Schulz (talk) 12:55, 14 January 2017 (UTC)[reply]
This thread is a very good demonstration of why I use software and hardware without attempting to modify either one of them...Thanks! Nyttend (talk) 13:00, 14 January 2017 (UTC)[reply]
Yes, an enormous amount of "legacy" hardware and software is in everyday active use. The world's financial systems run largely on decades-old COBOL software running on z/OS, which maintains backwards compatibility back to the 1960s. (The software is generally maintained and updated as necessary, but it isn't rewritten from scratch.) U.S. nuclear power plants are run by PDP-11s. CNC and SCADA systems running MS-DOS or other old software are everywhere. --47.138.163.230 (talk) 13:52, 14 January 2017 (UTC)[reply]
Chips aren't screwed onto a board; they are soldered by their connection pins. A newer chip is not compatible unless it has exactly the same functionality and pin layout. Using a newer and "better" chip means a redesign of the entire system: specifications, peripheral electronics, system board, software, etc. And then testing all of it. The cost would be huge. If the old chip is still available and does the job, there is no reason to go through all this. That's precisely why some popular chips are kept in production for such a long time (and often at low cost). Hope this explains a bit. :-) Jahoe (talk) 13:17, 14 January 2017 (UTC)[reply]
Parts obsolescence can be a huge problem. The manufacturer always claims that the "new and improved version" is also a "drop-in replacement". The problem is that it can be arbitrarily difficult to prove that the drop-in replacement actually meets all your requirements, including the requirements you forgot to document, or didn't even realize you were depending on.
The other big problem is testing. The longer a system has been running, the more likely it is that over the years, you fixed a bug or added a feature but forgot to write it down, and forgot to add a test for it to your list of test cases. So for a big, complicated, old system, testing it thoroughly (to make sure it does everything it's expected to, even after making some significant change like swapping out the CPU for a "new and improved" one) isn't just time-consuming and expensive, it can be downright impossible. —Steve Summit (talk) 14:18, 14 January 2017 (UTC)[reply]
I see mention of space and military hardware above. I assume this has something to do with a relative vulnerability of high-resolution circuits to radiation, though I don't pretend to know the details. Wnt (talk) 13:57, 15 January 2017 (UTC)[reply]
We have an article on that: Radiation hardening. --Guy Macon (talk) 19:55, 15 January 2017 (UTC)[reply]

January 15

Modular arithmetic with limited integer range

Let's say I'm using a programming language where integers range from −2^(n−1) to 2^(n−1) − 1 (inclusive) (for example, −2^31 to 2^31 − 1), and a, b, and m are integers in that range. What algorithm can I use to compute a·b modulo m, where 0 ≤ a, b < m? Obviously the result will be expressible within the range of integers. Assume I don't have access to any larger integer type and values can't be promoted to a larger type at any point of the algorithm. 24.255.17.182 (talk) 21:06, 15 January 2017 (UTC)[reply]