Wikipedia:Articles for deletion/Zime
From Wikipedia, the free encyclopedia
- The following discussion is an archived debate of the proposed deletion of the article below. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review). No further edits should be made to this page.
The result was delete, including the images. Sandstein 09:53, 24 December 2006 (UTC)
Apparently implausible object. From the article:
- The Universal ZIME Belt (Josta) consists of 18.4 trillion single ZIMEs. More than 400,000 persons and concepts are located on Josta so far. Each ZIME is able to store an unlimited amount of information while not growing bigger or more complex. Using today's computer technologies it is easy to encode information in a ZIME, but it requires a future quantum computer to decode it. This is why ZIME is often referred to as a Quantum Harddisc.
Has anyone any independent proof that this exists, as described in the article? If not, I suggest deletion as unverifiable. The Anome 16:56, 16 December 2006 (UTC)
- Update: if this AfD results in deletion, it would make sense to delete Universal Josta as well. It might also make sense to look at the relationship, if any, between User:Strazds and User:Turdus, and their respective contributions to the Armands Strazds article. -- The Anome 00:15, 17 December 2006 (UTC)
- Comment - I don't have a clue what it is (WP:HOLE), but it was suggested at Wikipedia:Articles for creation/2006-11-19. Note to the closing admin: if the article is deleted, please take care of the massive number of images that were uploaded for the article. I'm not even going to try and guess what a Zime stone is. (Image:Zime0 30.gif, Image:Zime1 30.gif, etc). --BigDT 17:25, 16 December 2006 (UTC)
- Keep - (1) The proof of 18.4 trillion ZIMEs is purely mathematical: 16^16, and it is rooted in the graphical structure of ZIME. Every ZIME consists of 16 elements, and each of them can carry 4 bits of information (0..15). That gives us 16^16 = 18,446,744,073,709,551,616, or 18.4 trillion (long scale). (2) The statistics of the EXPO project (www.zime.de) alone show that there are more than 300,000 participants in the project. Another roughly 100,000 participants come from other projects of the ZIME Foundation (e.g. 'Your ZIME in Belt of Latvia' with more than 40,000 ZIMEs). (3) ZIME is encrypted using a so-called one-way hash algorithm. This is a mathematical construction that allows very fast encryption but a very slow decryption procedure, which can take years even with the most powerful supercomputers. Other examples of one-way hash algorithms include the so-called message digest algorithms (MD5, SHA, etc.). Quantum Harddisc is a term coined by physicists like David Deutsch to describe a storage medium that can be used only with a hypothetical quantum computer and not with today's computers. --Turdus 20:51, 16 December 2006 (UTC)
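For readers checking the arithmetic in the comment above, a minimal sketch (the variable names are illustrative and not taken from any ZIME implementation):

```python
# Worked check of the combinatorial claim above: 16 elements, each
# holding one of 16 values (4 bits), gives 16**16 distinct patterns.
elements = 16
values_per_element = 16              # 4 bits -> 2**4 = 16 values (0..15)
total = values_per_element ** elements
print(total)                         # 18446744073709551616
print(total == 2 ** 64)              # True: equivalent to a 64-bit space
```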
- Delete as hand-waving nonsense. "Each ZIME is able to store an unlimited amount of information while not growing bigger or more complex." is not plausible if you want to recover the 'stored' data. (OTOH, a hash function does indeed have that property, but it's not designed to store data.) Input from a Real Cryptographer would be helpful here. -- Bpmullins | Talk 19:37, 16 December 2006 (UTC)
- Keep - As a Real Cryptographer I can only confirm the mathematical possibility of recovering data hashed by one-way algorithms. The most common approach is the so-called brute force attack (BFA), i.e. searching through all possible data combinations. With ZIME it works this way: any amount of data can be hashed, producing a Hex-16 digest. The digest is visualized as a ZIME. This digest can be obtained back from any ZIME at any time. Then the BFA is used to decode the original data. -- Turdus 21:50, 16 December 2006 (UTC)
- Comment - You're claiming to be able to recover a file from its MD5 digest? I don't think so. What makes your digest method any different from MD5 or SHA1? -- Bpmullins | Talk 22:27, 16 December 2006 (UTC)
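For context on the brute-force idea discussed above, a minimal sketch against a deliberately truncated toy digest (nothing here is drawn from ZIME itself; with a full 128-bit digest the same search is computationally infeasible):

```python
# Illustrative only: brute-force preimage search against a *truncated*
# digest. With a full 128-bit MD5 the same loop would need on the order
# of 2**128 trials.
import hashlib
import string
from itertools import product

def tiny_digest(data: bytes) -> bytes:
    """First 2 bytes (16 bits) of MD5 -- a toy stand-in for a real digest."""
    return hashlib.md5(data).digest()[:2]

target = tiny_digest(b"hi")              # pretend we only know this digest

# Search all printable 2-character candidates until one matches.
for candidate in product(string.printable, repeat=2):
    msg = "".join(candidate).encode()
    if tiny_digest(msg) == target:
        print("found preimage:", msg)    # may differ from b"hi": collisions exist
        break
```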
- Comment - Please read once more, and carefully, what I'm claiming. I never claimed *I* could do that! But the mathematical possibility is without doubt there, as I already explained. --Turdus 01:07, 17 December 2006 (UTC)
- Comment: no, you can't, and no such mathematical possibility exists. Consider a 128-bit MD5 hash value: there are only 2^128 possible such values. Now consider a 256-byte file: there are 2^2048 possible such files. A simple counting argument shows that it is not possible to uniquely recover input files from their digests, even with a quantum computer, because the files and digests cannot be put into one-to-one correspondence. -- The Anome 23:23, 16 December 2006 (UTC)
- Comment - You are right about one-to-one correspondence. But remember that we are searching only for meaningful texts. This limits the possibilities dramatically! --Turdus 01:45, 17 December 2006 (UTC)
- I was expecting you to raise that objection. Let's look at it in detail. "Meaningful" natural-language text contains just over one bit of entropy per byte (see [1]), and so, even if you restricted the 256-byte file to "meaningful" text, and conservatively assumed one bit per byte of entropy, there would still be 2^256 such files. The counting argument still applies. Please look up "unicity distance". -- The Anome 00:02, 17 December 2006 (UTC)
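A minimal arithmetic check of the counting argument in the two comments above (illustrative only):

```python
# Pigeonhole counting: possible inputs vastly outnumber possible digests,
# so a 128-bit digest cannot be uniquely inverted over these inputs.
digests = 2 ** 128                    # possible 128-bit MD5 values
all_files = 2 ** (256 * 8)            # all 256-byte files = 2**2048
meaningful = 2 ** 256                 # ~1 bit of entropy per byte, 256 bytes
print(all_files // digests == 2 ** 1920)    # True: average preimages per digest
print(meaningful // digests == 2 ** 128)    # True: even "meaningful" texts collide massively
```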
- ZIME is coded using a very strictly standardized data structure. How many of these 2^256 files do you think can have such a structure? --Turdus 02:18, 17 December 2006 (UTC)
- You are proposing that this is a universal lossless compression algorithm, then, with some input pre-expansion created by the "very strictly standardized data structure". Please see the lossless compression article to see why that is impossible. -- The Anome 00:29, 17 December 2006 (UTC)
- I never said it was a "universal lossless compression algorithm". I said it's a digest. There is a difference, as you may notice. --Turdus 02:37, 17 December 2006 (UTC)
- If you can invert a short digest into source files drawn from a pool with size larger than the number of possible states of the digest, you have produced an effective lossless compression algorithm. Or are you withdrawing your earlier assertions about invertibility? -- The Anome 00:51, 17 December 2006 (UTC)
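The impossibility result referred to above, shown in miniature (a toy enumeration, not any particular compressor):

```python
# There are 8 three-bit strings but only 2 + 4 = 6 strings of length
# one or two bits, so no lossless scheme can shorten every 3-bit input.
from itertools import product

inputs = ["".join(bits) for bits in product("01", repeat=3)]
shorter = ["".join(bits) for n in (1, 2) for bits in product("01", repeat=n)]
print(len(inputs), len(shorter))      # 8 vs 6: pigeonhole forces collisions
```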
- It's a pity that I have to waste my time because some people don't want to read carefully enough and generate spam-like comments! Everything you say is true for the PC on your desk. But I said clearly: a quantum computer is the solution. And nothing else. --Turdus 03:07, 17 December 2006 (UTC)
- Quantum computers cannot overcome counting arguments: even if it's a magic box (which quantum computers are not), if it can't get any more information than is encoded in the digest, it cannot possibly map its restricted set of inputs to the much larger set of outputs that must be generated if the digest is to be inverted. -- The Anome 01:11, 17 December 2006 (UTC)
- I am glad you found your own theory about quantum computing. In this case I naturally cannot oppose it - every theory has its right to live and to die! Good luck with further opposing ZIME! Maybe you can get it out of Wikipedia (who cares!), but not out of the minds of the creative people of the world! Best regards! --Turdus 03:24, 17 December 2006 (UTC)
- Counting arguments based on the pigeonhole principle aren't new, so I can't take credit for them. However, they work rather well on claims of infinite compression using message digests. -- The Anome 02:15, 17 December 2006 (UTC)
- This is true only for meaningless information. But can you imagine that a good message digest algorithm will produce the same output for the strings "The pigeonhole principle is an example of a counting argument" and "Hilbert's Grand Hotel can accommodate more guests without doubling up on rooms"? --Turdus 13:15, 17 December 2006 (UTC)
- Please re-read my comment above containing the words "unicity distance". And, while you're here, you might want to quote the section of Deutsch's The Fabric of Reality that you claimed supports your assertion that quantum computers can invert message digests, which you have so far been unable to supply. -- The Anome 12:29, 17 December 2006 (UTC)
- (1) Let me help you: the only question that really matters is whether there are substantially more messages encoded into ZIME than the biggest possible digest hex number, FFFFFFFFFFFFFFFF. The answer is no, and there are not expected to be. And if there ever were, we would just choose a longer digest for ZIME2, a kind of DoubleZIME. (2) Quantum computers, as described in "The Fabric of Reality", will be able to perform any computing operation in a reasonable amount of time (in fact, in almost no time!). If we put this in the context of ZIME, it becomes obvious that the term 'Quantum Harddisc' is a good one. --Turdus 16:18, 17 December 2006 (UTC)
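For scale, the largest 16-digit hexadecimal digest mentioned above corresponds to a 64-bit space (a quick check, for illustration only):

```python
# FFFFFFFFFFFFFFFF is the largest 16-hex-digit value, i.e. 2**64 - 1,
# so a 16-hex-digit digest can take at most 2**64 distinct values.
print(int("F" * 16, 16) == 2 ** 64 - 1)   # True
print(2 ** 64)                            # 18446744073709551616 possible digests
```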
- That's not consistent with your earlier claim of unbounded information storage. The important question is not how many messages you expect to encode during the lifetime of the system (which will obviously be less than 2^128) but the space of possible messages. Let's take your comment above as an example. It is 570 characters long, and thus has an information entropy in excess of 500 bits. It is almost certain that another "meaningful" message of similar length (indeed, billions upon billions of such messages) will have the same MD5 digest, and that it is therefore impossible to uniquely recover that message from its MD5 digest. If we widen our scope to include the entire set of meaningful messages of information entropy 500 bits, the "almost certain" becomes an absolute mathematical certainty. This applies regardless of whether you have a conventional computer and an unbounded length of time for brute forcing the message, a hypothetical quantum computer capable of performing any computable task, or even access to a hypothetical hypercomputer. I have said this before, in several different ways. Putting the members of a larger set in one-to-one correspondence with a smaller set, which is what your compression scheme amounts to, is simply impossible. Even if you have a quantum computer. -- The Anome 11:53, 18 December 2006 (UTC)
- (1) By now we have encoded nearly 400,000 ZIMEs, each of them unique. Is the "unicity distance" big enough? I do not insist on keeping the theoretical claim of the Quantum Harddisc in this article if it is too controversial, but I think it is worth mentioning in some form. Also, we have tested the ZIME digest (DWD) against MD5, with the result that DWD produced far fewer redundant outcomes; e.g. we encoded 2.3 million different data sets with DWD and every single digest was unique. (2) The ZIME Quantum Harddisc theory is also strongly supported by the prominent physicist Prof. Dr. Andris Buikis, who is also a member of the ZIME Foundation. (3) Even more important than decoding, I regard the verification functionality, i.e. only the person who has produced a certain ZIME can reproduce it again. --Turdus 02:13, 19 December 2006 (UTC)
- The question is not whether you can digest a set of a few million distinct messages and not have collisions (you could, for example, reasonably expect to digest about 2^64 randomly generated messages before seeing an MD5 collision -- see the birthday paradox to understand the reason for this particular number), but whether you can do this to the much larger set of all possible candidate messages. The difference in size between these two spaces is sufficiently big to make them qualitatively different: 2^512, for example, is 13,407,807,929,942,597,099,574,024,998,205,846,127,479,365,820,592,393,377,723,561,443,721,764,030,073,546,976,801,874,298,166,903,427,690,031,858,186,486,050,853,753,882,811,946,569,946,433,649,006,084,096, which is somewhat larger than 2.3 million. One is sufficiently large to overwhelm the message digest algorithm, the other one isn't. And this is only for 512-character natural-language messages: hardly "unlimited" storage. -- The Anome 08:51, 20 December 2006 (UTC)
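To make the birthday-paradox point above concrete, a small demonstration on a toy 16-bit digest (truncated MD5, purely illustrative):

```python
# Collisions appear after roughly sqrt(2**16), i.e. a few hundred random
# messages, when the digest is truncated to 16 bits -- the birthday effect.
import hashlib
import os

seen = {}
count = 0
while True:
    msg = os.urandom(8)                       # a random 8-byte "message"
    d = hashlib.md5(msg).digest()[:2]         # keep only 16 bits of the digest
    if d in seen and seen[d] != msg:
        print("collision after", count, "messages:", seen[d].hex(), msg.hex())
        break
    seen[d] = msg
    count += 1
```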
- Delete as original research or non-notable Latvian divination or cryptographic method. Lacks multiple independent verifiable sources. Edison 21:36, 16 December 2006 (UTC)
- Strong Keep - According to Edison (what vanity!), the central EXPO 2000 project of Latvia is not worthy to be on Wikipedia??? "Non-notable Latvian divination" - somebody, please send this guy to the WikiDesert :) --Turdus 00:02, 17 December 2006 (UTC)
- This isn't strictly a vote, but please note that it's still bad form to say "keep" three times. Traditionally, on subsequent comments, you say comment instead of repeating your position, so as not to be confusing. BigDT 22:27, 16 December 2006 (UTC)
- Delete - I'm still trying to decide if zime is a cult or an algorithm. BigDT 22:27, 16 December 2006 (UTC)
- Comment & Speedy Keep :) - Dear BigDT, ZIME is both and much more! Just let me complete the article! BTW, have you already got your own ZIME? --Turdus 00:38, 17 December 2006 (UTC)
- Comment: I've struck through three duplicate "keeps" by Turdus, who is also the original article poster. I was wondering why there were so many "keep" comments for this. Remember, AfD is not a vote; and even if it were, voting multiple times (four, at the latest count) is unlikely to persuade others of the merits of your argument. And, again, if it were a vote, after removing the duplicates, I make it four "deletes" against one "keep" so far, with the one "keep" coming from the original author of the article. -- The Anome 23:26, 16 December 2006 (UTC)
- Comment - Dear Anome, it's really bad style to strike through other users' comments. I like big bubbles because they tend to burst in so many funny ways! :) And: one good KEEP against 1000 stupid DELETEs must win in a really good Wikipedia! --Turdus 01:41, 17 December 2006 (UTC)
- Perhaps you would like to remove your duplicate "keep" comments yourself, rather than "vote" multiple times? -- The Anome 23:55, 16 December 2006 (UTC)
- Comment - No, thanks. If I vote many times, I really mean it! BTW, strictly speaking AfD is not a vote, as somebody already correctly said (oh, that was you!). --Turdus 02:04, 17 December 2006 (UTC)
- Delete as not verifiable. No notable independent sources exist (i.e., other than the web sites created by the "founders" of ZIME, there are no other reviews or references for it). QuiteUnusual 00:14, 17 December 2006 (UTC)
- Comment - What exactly is not verifiable for you? That EXPO 2000 took place? That Latvia participated with the ZIME project? Or do you believe nothing that you cannot touch? --Turdus 02:32, 17 December 2006 (UTC)
- Comment What is not verifiable is ZIME. Feel free to write an article about EXPO 2000 if you want. The fact that the EXPO took place is irrelevant to the verifiability of the claims about something that was showcased there. QuiteUnusual 11:07, 17 December 2006 (UTC)
- Comment If you can demonstrate this, you could probably justify a one-sentence mention of it in the EXPO 2000 article, without the impossible claims of infinite information storage. (Oh, look, there is one.) If you can't demonstrate this through citing independent, reliable sources, then this article is an automatic delete, together with that sentence. Please see WP:V and WP:OR. -- The Anome 00:47, 17 December 2006 (UTC)
- Comment - Is the book "The Fabric of Reality" by the Oxford professor David Deutsch a reliable enough source for you, or does it also contain "impossible claims"? --Turdus 02:58, 17 December 2006 (UTC)
- Go on then, tell us where David Deutsch tells us a quantum computer can invert a message digest. -- The Anome 01:17, 17 December 2006 (UTC)
- Read the book - it's really worth it! And I hope you will discover much more than a simple method of inverting a message digest. --Turdus 03:30, 17 December 2006 (UTC)
- I've read it. I find it interesting that you think I haven't. I certainly can't remember your project being mentioned anywhere in it. So, to reiterate, can you show me where David Deutsch tells us a quantum computer can uniquely invert a message digest? -- The Anome 01:38, 17 December 2006 (UTC)
- Please see my comment about the book above. --Turdus 18:03, 17 December 2006 (UTC)
- Please see my reply above. Claims about the computational power of quantum computers are irrelevant to the counting argument, which holds even if you have a magic box instead of a computer, provided that the magic box gives a consistent output for each input. -- The Anome 12:13, 18 December 2006 (UTC)
- The above discussion is preserved as an archive of the debate. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review). No further edits should be made to this page.