Shannon V. OKeets
Posts: 22095
Joined: 5/19/2005 From: Honolulu, Hawaii Status: offline
quote:
ORIGINAL: Zap

After reading all the posts, would it be naive of me to say that a college AI coding class or course might help coders? The class could specifically deal with computer wargame coding - the how-to's and what not to do. Or, as has been stated, would too many developments, too fast, make this almost ineffective?

I, without a degree, worked professionally in AI for several years (1988 - 1990) and we developed a fairly effective system. Just before I left, our group had grown to 32 (up from 5) and was working on about a dozen different projects. Our system was to take a problem that the manufacturing company I worked for had, and then try to find something in university-published technical papers that could be brought to bear on it. Basically, the idea was to take concept papers done on trivial little problems and turn them into practical applications. This paid off very well for our company, saving millions of dollars by finding software solutions to a very diverse set of problems.

I mention this as a way that universities and corporations together can advance the understanding of how to make things work. The fundamental principles come from the university. The application to real-world problems requires getting one's hands very dirty in practicalities, which is a poor use of the university professors' talents. A reasonable analogy is the working relationship between an architect and an engineer: both are needed to build things, and neither is very good at the other one's job.

I have less enthusiasm for the universities' ability to train professionals. This is especially hard to do in a developing science. At the earliest stage, a new concept is very raw, with new words being created just so the people studying/thinking about it can talk to each other. That vocabulary evolves over time and pertains to the basic elements of the concept. As some knowledge is gained, the most promising branches are explored in more detail. Many of those branches yield no fruit. If the potential benefit/payoff is large enough (e.g., genetics), then a separate industry arises whose sole purpose is to facilitate getting a better understanding of how everything works - both together and separately. Ultimately a new industry is born that brings the original concept into a practical reality. Ideally, the new industry doesn't pollute the world too badly.

I would estimate that most of the practical application of new technology is in the hands of individuals outside the university system. These people become professionals through their dedication to one subject over their lifetimes. It is only after an industry has existed for a very long time that professional organizations arise with government (or quasi-government) certification in the field. Universities can then offer courses whose syllabus focuses on certification.

Different subfields of AI are at different stages of this development process. Today, robotics is standard in manufacturing automobiles. That was not true in 1975. Google uses search techniques that didn't exist in 1975, and aggressively seeks to improve the ones they have. These are not exactly mature industries, but they are no longer wet behind the ears. Other examples from the last 30 years are graphical user interfaces in general, methodologies for rendering 3D graphics on a 2D surface, computer animation, microchip design, ... the list is long.

My wife learned to code while working at a bank. She swears that it was the best system she has ever seen - and I believe her.
The bank had experienced programmers mentor those who were learning to program. The classes focused on writing actual code for typical problems the bank had. This was in COBOL for business applications and IBM assembler for systems applications. Every student wrote code that was critiqued by an experienced programmer, line by line. There were standards for documentation, naming variables, and a host of other programming basics. Failure to follow standards was failure. The result was that in a couple of years the bank created dozens of very skilled programmers. They were the first bank to introduce automated teller machines in the early 1980s. Most of the programmers they created went on to senior management positions in IT.

Ideas from the universities are almost essential. People in industry who are trying to make things work every day rarely have time to explore crazy new ideas. They lack the social environment where they can kick those crazy ideas around and separate the good from the bad. They are under trade-secret restrictions and can't publish their best papers for rigorous examination by their peers. On the other hand, the university types don't grasp the realities of missing variables, or the introduction of another 50 additional variables, or the implications of expanding a test case from 6 to 47,000,000,000. Only in industry can the benefit be compared to the cost of implementation. Industry will pare down a large general idea to the bare bones it needs to accomplish a specific task. And so there is a disconnect between what the university teaches and what new industries need.

In the late 1980s, AI had the curious situation (it may still, for all I know) that graduates with a degree in AI were sucked up by industry. What exactly they worked on then was never clear to me, but the money was good and the facilities (hardware/software) being offered were quite wonderful. I don't know if the love affair continues or not. There are a few other neo-industries in comparable situations today, I believe. If you know something about it, and have a degree that proves it, then you're hired. If I were a betting man, I would give good odds that a lot of those graduates weren't very productive for a long time. I make this judgment simply on the fact that new graduates do not know the industry to which they are trying to apply their knowledge.

quote:
a college AI coding (class or course) might help coders

Only if taught by someone who has a lot of experience writing code, and if it is focused on a specific application (or applications).
_____________________________
Steve

Perfection is an elusive goal.