Talk:Computational lithography
This article has not yet been rated on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Good topic. Happy to see a timely topic get good devoted coverage. Guiding light (talk) 14:24, 17 December 2008 (UTC)
Too much jargon
User:Politizer, thanks for your edits and inputs. As an aside, I see that just as you were deleting the "Mentor Graphics uses Cell Engine" as too promotional, I was adding the "Brion Technologies makes a custom accelerator" bit. You probably think I'm messing with you. Then when I go visit your talk page I see you've been forced to put up some sort of poppy field image because you've exceeded your wikistress limit. Sorry about that, I'm not trying to stress you out. Let me suggest a solution here. I think you'll agree that the article benefits from some sort of discussion of how the immense computational problem will be tackled. I referred to specific companies and products (btw, none of which I have any association with) in the spirit of verifiability. Maybe if we pushed the companies into footnotes and something along the lines of: "Some companies have resorted to advanced multi-CPU processor architectures[1] while others are creating rack-based systems to handle the immense computational load[2]", etc. etc., you will be able to stop eating those poppies (which really are bad for your health).
The main point that you bring up -- too much jargon -- will take a bit of skill to fix. I need your help. As an expert in electronics I may not be the best judge of where to draw the line. Remembering that this is a technical topic, please give me some feedback: looking at each of the major sections of the article (intro paragraph, historical context, eye-crossing details, and end section on 100+ years of computing) -- which do you find to have too much jargon? All? Only the middle section? I'm asking from the perspective of a general article, not necessarily a DYK winner. Based on your inputs I'll see what I can do to simplify the jargon. bill devanney (talk) 06:52, 9 December 2008 (UTC)
- Haha, don't worry about the wikistress; that was from a totally different issue that was going on at the time with another user. Your edits here aren't stressing me out at all :-).
- As for how we can clean this up, I don't have time to read through the article right now, but I'll try to take a look tonight when I'm free. Thanks for your message! —Politizer talk/contribs 17:24, 9 December 2008 (UTC)
- Hm...I have tried to do some reading and I can't even really tackle the first paragraph. I'm simply not familiar with the transition that sparked the emergence of CL, so my brain can't even start to try to think about the rest. Here is my proposal for how we can work on making the language clearer.... send me an e-mail or talk page message just trying to explain/teach me the background and significance of computational lithography, as if I were an average person or even a high school kid or something. Don't worry about how long or short it is or how well-written, just write something with the assumption that I know nothing about lithography or any of this kind of technology (so, use a lot of appositives and descriptors—after new terms, try adding a comma and some brief description/explanation). That should serve as both a crash course for me so that I know what I'm doing, and—more importantly, I think, both for this article and for our future WP work—an exercise in writing for the uninitiated/naive audience.
- Let me know if this idea sounds acceptable to you! Thanks, —Politizer talk/contribs 01:01, 10 December 2008 (UTC)
I've been offline for a few days but thinking quite a bit about your suggestion. I'm back. I like the gist of what you're proposing except I think I prefer to do it through the article itself rather than a separate effort. I don't want to duplicate the effort and I'm not genius enough to write clear prose effortlessly. (Plus, perhaps along the way the magic of Wikipedia will kick in and other parties will join in to make it better than ever.)
My plan on improving things:
- keep the 1st paragraph as a relatively concise summary, even if it's a bit incomprehensible for the non-science types -- if it's a complete miss compared to what the reader was looking to learn, at least they know that right away. If they're interested they can invest in reading more.
- a brief historical note on Moore's Law which describes the amazing trend of miniaturization in microelectronics
- Then introduce the big change to the article compared to the current version: explain more about the core problem we're up against: the blurriness introduced by optical diffraction. That's what makes it practically impossible to shine a clean little 30 x 30 nm square of light on a surface using a 193 nm wavelength light source. If the person groks diffraction they are on their way to understanding the problem that is (partially) solved by computational lithography.
- dive into the gory details — which can certainly be improved although there's a limit to how simple it can be made
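The diffraction bottleneck described in the third point above can be illustrated with a quick back-of-the-envelope calculation. This is only a sketch: the Rayleigh-style resolution criterion is standard, but the numerical aperture and k1 values below are illustrative assumptions (typical of ArF immersion lithography), not figures taken from this discussion.

```python
# Sketch: minimum printable half-pitch via the resolution criterion
# half_pitch = k1 * wavelength / NA. Assumed illustrative values:
wavelength_nm = 193.0   # ArF excimer laser wavelength
na = 1.35               # numerical aperture of a water-immersion scanner
k1 = 0.25               # theoretical single-exposure floor for k1

half_pitch_nm = k1 * wavelength_nm / na
print(f"limit = {half_pitch_nm:.1f} nm")  # prints about 35.7 nm
```

Even at the theoretical single-exposure floor of k1 = 0.25, the printable half-pitch (~36 nm) is larger than the 30 nm feature mentioned above, which is why resolution enhancement and computational lithography enter the picture at all.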
I plan to do it in pieces, so don't be alarmed if I make a change and then sign off. I'm busy with other things right now but I do want to work on this.
— bill devanney (talk) 07:12, 17 December 2008 (UTC)
- That sounds like a good plan. I agree with you that the main thing the article needs is an introduction to the underlying problem that computational lithography is meant to deal with. —Politizer talk/contribs 14:31, 17 December 2008 (UTC)
Reason needed for the massive calculations
The article should explain why the massive calculations are needed, as compared with simply shining a coherent light source through the mask to obtain the actual full diffraction pattern, then performing a spatial Fourier transform on this pattern to obtain the frequency domain, isolating the higher frequencies, and inverse-transforming them to obtain the "anti-diffraction effect" modifications. While this is not quite sufficient, due to the need to reduce the frequencies of the corrections to the point where they can be printed, I imagine someone has developed a simple method like this for computing Fresnel-zone corrections that would take only a second or two of computing time. If so, shouldn't it be reported here, for balance? If not, shouldn't it be explained why such simple computations are insufficient? David Spector (talk) 13:22, 3 July 2011 (UTC)
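For what it's worth, the reason the one-shot inverse-filtering scheme described above cannot work can be demonstrated numerically. In the coherent (Abbe) imaging model the projection lens acts as a hard low-pass filter: spatial frequencies beyond NA/wavelength are zeroed outright, not merely attenuated, so no inverse transform can recover them, and the mask pattern must instead be found by iterative optimization. The following 1-D sketch uses only the Python standard library; all dimensions and optical parameters are illustrative assumptions.

```python
import cmath

def dft(x):
    # Naive O(n^2) discrete Fourier transform (standard library only).
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

n = 256
pitch_nm = 2.0                                            # sample spacing
mask = [1.0 if 118 <= t < 134 else 0.0 for t in range(n)]  # a ~32 nm line

def freq(k):
    # Spatial frequency (cycles/nm) of DFT bin k, aliased to negatives.
    return (k if k <= n // 2 else k - n) / (n * pitch_nm)

cutoff = 1.35 / 193.0               # NA / wavelength: coherent cutoff
spectrum = dft(mask)
pupil = [1.0 if abs(freq(k)) <= cutoff else 0.0 for k in range(n)]
image = idft([s * p for s, p in zip(spectrum, pupil)])  # blurred image

# Components with pupil == 0 are zeroed, not attenuated; inverse
# filtering would divide by zero there, so no linear post-processing
# of the image can restore the sharp edges.
lost = sum(1 for k in range(n)
           if pupil[k] == 0.0 and abs(spectrum[k]) > 1e-9)
print(lost, "of", n, "frequency components irrecoverably zeroed")
```

In this toy setup only a handful of low-frequency components survive the pupil; everything else is gone for good. That is the basic reason a fast Fresnel-zone-style deconvolution is insufficient and the problem is instead posed as large-scale mask optimization, which is where the massive computation comes from.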