<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: Designing the Best Solution, Not the Best Guess</title>
	<atom:link href="http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/</link>
	<description>Collaboration That Works</description>
	<lastBuildDate>Mon, 04 Mar 2013 18:36:34 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.5.1</generator>
	<item>
		<title>By: Robert &#34;Doc&#34; Hall</title>
		<link>http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/#comment-350</link>
		<dc:creator>Robert &#34;Doc&#34; Hall</dc:creator>
		<pubDate>Mon, 15 Sep 2008 14:52:10 +0000</pubDate>
		<guid isPermaLink="false">/blog/post/High-Perf-Computing.aspx#comment-350</guid>
		<description><![CDATA[Any recent design project involving a complex integrated product that I know about has used simulation, if only here and there, rag-tag.  More comprehensive practice is necessary, as at Cummins, whose new 6.7L turbo diesel was based on round-the-clock simulation using a computing facility in India.  Components had to reinforce each other.  Parts and sub-systems could not just &quot;bolt onto the block.&quot;  Developing this design capability was more significant to Cummins than the engine, the first to be designed that way.    

That said, I wonder if we are suggesting a tool before fully defining the problem.  For example, in a prior life I worked in an engineering information center.  I found that half the job was probing what requesters were really trying to do before helping them use that old rickety system.  That is, for the time, system capability was no problem; deciding what to do with it was.  I doubt the human element has changed much, so if reducing development time and loop-backs is the objective, there&#039;s more to it than simulation capability.

Of course, simulation requires validating that both model and data are relevant to design intent, but what of possibilities and data that are not known; not included in the simulation?  I&#039;m somewhat aware of Mike Gnam and Paul Chalmer&#039;s work.  They begin to address this.  DfX is a huge shift toward integrative design.  Unknowns and data gaps are common.  If Decision Incite addresses this, I did not pick it up from their site.  Many designers have a limited concept of DfX; and relevant data for it are not easy to come by if they do.  That is, if designers are not clamoring for detailed simulation capability today, why?  

Robert W. &quot;Doc&quot; Hall
Editor-in-Chief, Target
Association for Manufacturing Excellence


]]></description>
		<content:encoded><![CDATA[<p>Any recent design project involving a complex integrated product that I know about has used simulation, if only here and there, rag-tag.  More comprehensive practice is necessary, as at Cummins, whose new 6.7L turbo diesel was based on round-the-clock simulation using a computing facility in India.  Components had to reinforce each other.  Parts and sub-systems could not just &quot;bolt onto the block.&quot;  Developing this design capability was more significant to Cummins than the engine, the first to be designed that way.    </p>
<p>That said, I wonder if we are suggesting a tool before fully defining the problem.  For example, in a prior life I worked in an engineering information center.  I found that half the job was probing what requesters were really trying to do before helping them use that old rickety system.  That is, for the time, system capability was no problem; deciding what to do with it was.  I doubt the human element has changed much, so if reducing development time and loop-backs is the objective, there&#8217;s more to it than simulation capability.</p>
<p>Of course, simulation requires validating that both model and data are relevant to design intent, but what of possibilities and data that are not known; not included in the simulation?  I&#8217;m somewhat aware of Mike Gnam and Paul Chalmer&#8217;s work.  They begin to address this.  DfX is a huge shift toward integrative design.  Unknowns and data gaps are common.  If Decision Incite addresses this, I did not pick it up from their site.  Many designers have a limited concept of DfX; and relevant data for it are not easy to come by if they do.  That is, if designers are not clamoring for detailed simulation capability today, why?  </p>
<p>Robert W. &quot;Doc&quot; Hall<br />
Editor-in-Chief, Target<br />
Association for Manufacturing Excellence</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Gene Allen</title>
		<link>http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/#comment-349</link>
		<dc:creator>Gene Allen</dc:creator>
		<pubDate>Wed, 10 Sep 2008 20:14:40 +0000</pubDate>
		<guid isPermaLink="false">/blog/post/High-Perf-Computing.aspx#comment-349</guid>
		<description><![CDATA[It is very encouraging to see the comments to this post.  There are a couple of items that are important to clarify about the High Performance SIMULATION capability being established.  The focus is on quickly getting reliable information by taking advantage of:

- COMMODITY COMPUTING, meaning that the cost per CPU-Hr is now measured in cents, from 10 cents to 60 cents per CPU-Hr.  There is no need to purchase expensive hardware, as it is being offered on-demand by companies ranging from Amazon to IBM.

- MINIMAL ASSUMPTIONS, as the Monte Carlo process used in Simulation Supported Decision Making is independent of the number of variables in the problem and is mathematically simple.  A friend who is an MIT mathematician relayed that &quot;the Monte Carlo process is NOT mathematically elegant (meaning complex), but it just gives you the right answers.&quot;  This changes the problem definition from historically making simplifying assumptions to solve the problem, to incorporating as many variables as possible (minimizing assumptions) and letting the computer analysis sort out what is important (vice assuming).

The key point is that engineering analysis has historically been deterministic, in which each variable has one value.  We are now promoting stochastic simulation, in which each variable has a range of values (as in what really exists).  Stochastic (or Monte Carlo) simulation yields significantly more information from a model, such as identification of relationships between variables, and outliers (combinations of variables that generate non-intuitive results).

The barrier to using stochastic simulation in the past has been the computational cost and time needed to run hundreds or thousands of analysis runs.  Advances in computers have enabled stochastic simulation with CPU costs of pennies per hour.  With this capability every engineering analysis should be done stochastically.  I presently demo a simple stress calculation on an I-beam in Excel on my laptop to show how much more information can be gained from a model using stochastic simulation.  Car companies have been conducting stochastic simulation of car crashes for years.  The process is independent of problem size.  Every analysis should be run stochastically to take variability into account and get orders of magnitude more information - because you can today, for minimal extra cost, with commodity computing.

Another key point, raised by Jerome Aiello, is the need for this to be used by people who have real-world experience and understanding of what they are doing.  A poor model will give poor results.  (A side benefit that we have found with stochastic simulation is that it can often detect poor analysis models, as the simulation will not run on some combinations of variables as a result of poor modeling practices.)

Simulation Supported Decision Making is a way to take advantage of commodity computing to learn from &quot;virtual&quot; experience, complementing our real experience.

Feel free to call or e-mail if anyone has further questions or would like more information on SSDM.  I can be contacted at geneallen@decisionincite.com or at 703-582-5554.]]></description>
		<content:encoded><![CDATA[<p>It is very encouraging to see the comments to this post.  There are a couple of items that are important to clarify about the High Performance SIMULATION capability being established.  The focus is on quickly getting reliable information by taking advantage of:</p>
<p>- COMMODITY COMPUTING, meaning that the cost per CPU-Hr is now measured in cents, from 10 cents to 60 cents per CPU-Hr.  There is no need to purchase expensive hardware, as it is being offered on-demand by companies ranging from Amazon to IBM.</p>
<p>- MINIMAL ASSUMPTIONS, as the Monte Carlo process used in Simulation Supported Decision Making is independent of the number of variables in the problem and is mathematically simple.  A friend who is an MIT mathematician relayed that &quot;the Monte Carlo process is NOT mathematically elegant (meaning complex), but it just gives you the right answers.&quot;  This changes the problem definition from historically making simplifying assumptions to solve the problem, to incorporating as many variables as possible (minimizing assumptions) and letting the computer analysis sort out what is important (vice assuming).</p>
<p>The key point is that engineering analysis has historically been deterministic, in which each variable has one value.  We are now promoting stochastic simulation, in which each variable has a range of values (as in what really exists).  Stochastic (or Monte Carlo) simulation yields significantly more information from a model, such as identification of relationships between variables, and outliers (combinations of variables that generate non-intuitive results).</p>
<p>The barrier to using stochastic simulation in the past has been the computational cost and time needed to run hundreds or thousands of analysis runs.  Advances in computers have enabled stochastic simulation with CPU costs of pennies per hour.  With this capability every engineering analysis should be done stochastically.  I presently demo a simple stress calculation on an I-beam in Excel on my laptop to show how much more information can be gained from a model using stochastic simulation.  Car companies have been conducting stochastic simulation of car crashes for years.  The process is independent of problem size.  Every analysis should be run stochastically to take variability into account and get orders of magnitude more information &#8211; because you can today, for minimal extra cost, with commodity computing.</p>
<p>Another key point, raised by Jerome Aiello, is the need for this to be used by people who have real-world experience and understanding of what they are doing.  A poor model will give poor results.  (A side benefit that we have found with stochastic simulation is that it can often detect poor analysis models, as the simulation will not run on some combinations of variables as a result of poor modeling practices.)</p>
<p>Simulation Supported Decision Making is a way to take advantage of commodity computing to learn from &quot;virtual&quot; experience, complementing our real experience.</p>
<p>Feel free to call or e-mail if anyone has further questions or would like more information on SSDM.  I can be contacted at <a href="mailto:geneallen@decisionincite.com">geneallen@decisionincite.com</a> or at 703-582-5554.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Jerome A. Aiello</title>
		<link>http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/#comment-348</link>
		<dc:creator>Jerome A. Aiello</dc:creator>
		<pubDate>Wed, 10 Sep 2008 17:32:25 +0000</pubDate>
		<guid isPermaLink="false">/blog/post/High-Perf-Computing.aspx#comment-348</guid>
		<description><![CDATA[In my last post, I may not have explained myself sufficiently.

I SHOULD have said: &quot;What you need is Shop Rats with an Associates Degree IN SIMULATION ENGINEERING&quot;

I do not want to disparage Computer Science Majors, because they are talented and learned individuals. But what I typically saw is that those were the types of graduates they were hiring to learn Simulation Engineering, and that was TOTALLY wrong!

What Simulation Engineers need the MOST is HANDS-ON knowledge of THE PROCESSES THEY ARE TRYING TO SIMULATE, NOT how computers work.

THAT&#039;S why Shop Rats are better suited to be Simulation Engineers than 
Computer Science Majors.

Computer Science Majors usually attempt things that are not practical in the real world, such as holding a car hood with a suction-cup end-of-arm fixture and moving it at 2000 mm/sec, which works just fine in the computer world, but fails miserably in the real world, where the AIR RESISTANCE blows the hood off the suction cups before the destination is reached.

A Shop Rat who has, for example, done Robot Programming on the Shop Floor, won&#039;t make that mistake.]]></description>
		<content:encoded><![CDATA[<p>In my last post, I may not have explained myself sufficiently.</p>
<p>I SHOULD have said: &quot;What you need is Shop Rats with an Associates Degree IN SIMULATION ENGINEERING&quot;</p>
<p>I do not want to disparage Computer Science Majors, because they are talented and learned individuals. But what I typically saw is that those were the types of graduates they were hiring to learn Simulation Engineering, and that was TOTALLY wrong!</p>
<p>What Simulation Engineers need the MOST is HANDS-ON knowledge of THE PROCESSES THEY ARE TRYING TO SIMULATE, NOT how computers work.</p>
<p>THAT&#8217;S why Shop Rats are better suited to be Simulation Engineers than<br />
Computer Science Majors.</p>
<p>Computer Science Majors usually attempt things that are not practical in the real world, such as holding a car hood with a suction-cup end-of-arm fixture and moving it at 2000 mm/sec, which works just fine in the computer world, but fails miserably in the real world, where the AIR RESISTANCE blows the hood off the suction cups before the destination is reached.</p>
<p>A Shop Rat who has, for example, done Robot Programming on the Shop Floor, won&#8217;t make that mistake.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Phil Callihan</title>
		<link>http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/#comment-347</link>
		<dc:creator>Phil Callihan</dc:creator>
		<pubDate>Wed, 10 Sep 2008 09:33:43 +0000</pubDate>
		<guid isPermaLink="false">/blog/post/High-Perf-Computing.aspx#comment-347</guid>
		<description><![CDATA[Here is a link to a recent Wired article that gives some background on the theory behind what NCMS is attempting.

[quote]...The Petabyte Age is different because more is different. Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies.

At the petabyte scale, information is not a matter of simple three- and four-dimensional taxonomy and order but of dimensionally agnostic statistics. It calls for an entirely different approach, one that requires us to lose the tether of data as something that can be visualized in its totality. It forces us to view data mathematically first and establish a context for it later. For instance, Google conquered the advertising world with nothing more than applied mathematics. It didn&#039;t pretend to know anything about the culture and conventions of advertising — it just assumed that better data, with better analytical tools, would win the day. And Google was right...[/quote]

http://www.wired.com/science/discoveries/magazine/16-07/pb_theory]]></description>
		<content:encoded><![CDATA[<p>Here is a link to a recent Wired article that gives some background on the theory behind what NCMS is attempting.</p>
<p>[quote]&#8230;The Petabyte Age is different because more is different. Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies.</p>
<p>At the petabyte scale, information is not a matter of simple three- and four-dimensional taxonomy and order but of dimensionally agnostic statistics. It calls for an entirely different approach, one that requires us to lose the tether of data as something that can be visualized in its totality. It forces us to view data mathematically first and establish a context for it later. For instance, Google conquered the advertising world with nothing more than applied mathematics. It didn&#8217;t pretend to know anything about the culture and conventions of advertising — it just assumed that better data, with better analytical tools, would win the day. And Google was right&#8230;[/quote]</p>
<p><a href="http://www.wired.com/science/discoveries/magazine/16-07/pb_theory" rel="nofollow">http://www.wired.com/science/discoveries/magazine/16-07/pb_theory</a></p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Jerome A. Aiello</title>
		<link>http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/#comment-346</link>
		<dc:creator>Jerome A. Aiello</dc:creator>
		<pubDate>Wed, 10 Sep 2008 01:41:27 +0000</pubDate>
		<guid isPermaLink="false">/blog/post/High-Perf-Computing.aspx#comment-346</guid>
		<description><![CDATA[As a Simulation Engineering Instructor, I can attest to the fact that Physics-Based Simulation has already proven to be a valuable tool in manufacturing and other industries, literally saving hundreds of millions of dollars in mistakes by catching those mistakes in the design phase.

Automotive Manufacturing now does nearly all Robotic programming using Off-Line Simulation, NIST uses Simulation, Russian Scientists are using it to clean up Chernobyl, the navy uses it to help design atomic submarines, and there are several Sim Software products that will run on laptops with good results. So extremely expensive mega-computers are not needed in most cases.

The major caveat involves GIGO. 

Well-Trained and Experienced operators are required to run the software, and they need real-world experience in the things they are trying to simulate.

Freshly-graduated Computer Science majors who have never worked in the environments simulated, simply will not do.

DARPA found this to be true, and tried to get operators trained in Community Colleges using a project called &quot;Conduit&quot; in the late 90&#039;s. I was the instructor on this project,  but it failed due to poor management.

You don&#039;t need PhD&#039;s.  You need Shop Rats with Associates Degrees!]]></description>
		<content:encoded><![CDATA[<p>As a Simulation Engineering Instructor, I can attest to the fact that Physics-Based Simulation has already proven to be a valuable tool in manufacturing and other industries, literally saving hundreds of millions of dollars in mistakes by catching those mistakes in the design phase.</p>
<p>Automotive Manufacturing now does nearly all Robotic programming using Off-Line Simulation, NIST uses Simulation, Russian Scientists are using it to clean up Chernobyl, the navy uses it to help design atomic submarines, and there are several Sim Software products that will run on laptops with good results. So extremely expensive mega-computers are not needed in most cases.</p>
<p>The major caveat involves GIGO. </p>
<p>Well-Trained and Experienced operators are required to run the software, and they need real-world experience in the things they are trying to simulate.</p>
<p>Freshly-graduated Computer Science majors who have never worked in the environments simulated, simply will not do.</p>
<p>DARPA found this to be true, and tried to get operators trained in Community Colleges using a project called &quot;Conduit&quot; in the late 90&#8242;s. I was the instructor on this project,  but it failed due to poor management.</p>
<p>You don&#8217;t need PhD&#8217;s.  You need Shop Rats with Associates Degrees!</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Ward Elwood</title>
		<link>http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/#comment-345</link>
		<dc:creator>Ward Elwood</dc:creator>
		<pubDate>Tue, 09 Sep 2008 16:50:19 +0000</pubDate>
		<guid isPermaLink="false">/blog/post/High-Perf-Computing.aspx#comment-345</guid>
		<description><![CDATA[Through my work developing and marketing new consumer packaged goods (CPG), I recognize the importance of finding issues early.  Increasing pressure to decrease lead times and development costs allows less time to comb through every possible option.  I can see how this sort of solution could support a hypothesis-based development approach by helping developers and decision-makers identify and focus on the more critical development areas.

Additionally, I could see applications beyond product development, in general business development.  My team has recently gone through an exercise in which we selected key target geographies based on a variety of criteria including sales, growth, market size, margin, etc.  We were limited by the number of variables we could handle.  This solution would seem to eliminate that constraint.

While I see the potential of this sort of solution, I have trouble seeing how WELL it might translate to business solutions and/or CPG product development.  Development of a complex model may prove more effort than the solution would ultimately be worth, given the short cycle time and lower overall price (i.e. lower risk) in this market.  A super-computer may be more power than I need.

Any thoughts on simpler applications of this solution?  Could you provide an example of this tool (or one like it) being used (i.e. sample variables, sample inputs, etc) so I could better translate this into my potential needs?

Thanks,
Ward Elwood
Sr. Brand Manager, Developing &amp; Emerging Markets 
Kimberly-Clark]]></description>
		<content:encoded><![CDATA[<p>Through my work developing and marketing new consumer packaged goods (CPG), I recognize the importance of finding issues early.  Increasing pressure to decrease lead times and development costs allows less time to comb through every possible option.  I can see how this sort of solution could support a hypothesis-based development approach by helping developers and decision-makers identify and focus on the more critical development areas.</p>
<p>Additionally, I could see applications beyond product development, in general business development.  My team has recently gone through an exercise in which we selected key target geographies based on a variety of criteria including sales, growth, market size, margin, etc.  We were limited by the number of variables we could handle.  This solution would seem to eliminate that constraint.</p>
<p>While I see the potential of this sort of solution, I have trouble seeing how WELL it might translate to business solutions and/or CPG product development.  Development of a complex model may prove more effort than the solution would ultimately be worth, given the short cycle time and lower overall price (i.e. lower risk) in this market.  A super-computer may be more power than I need.</p>
<p>Any thoughts on simpler applications of this solution?  Could you provide an example of this tool (or one like it) being used (i.e. sample variables, sample inputs, etc) so I could better translate this into my potential needs?</p>
<p>Thanks,<br />
Ward Elwood<br />
Sr. Brand Manager, Developing &amp; Emerging Markets<br />
Kimberly-Clark</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: John Shore</title>
		<link>http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/#comment-344</link>
		<dc:creator>John Shore</dc:creator>
		<pubDate>Sun, 07 Sep 2008 09:20:12 +0000</pubDate>
		<guid isPermaLink="false">/blog/post/High-Perf-Computing.aspx#comment-344</guid>
		<description><![CDATA[SSDM is something that has been around for a while.  Running mathematical models to find outliers is really not a new concept. The math is very complex, though, which is the reason most of us have had to make lots of assumptions to make the equations manageable to solve. It takes a lot of computing power to make this work.  In the past only large corporations had the budgets to buy the equipment to run more complex models.  This is the reason why you typically saw this type of analysis run only by large automotive, aerospace, and defense companies.  I think the time has finally come where you can actually run these models without the need to make hundreds of assumptions.  The cost of computing power is really cheap now and the number of calculations per second has dramatically increased in the last decade.

I think that this type of software will now put the SSDM analysis in the hands of the middle tier manufacturers.  This is a good thing since it should improve products and bring about more competition in areas that typically have been the domain of larger companies.  It also should bode well for the consumer who should see the effects of this type of product quality improvement in cheaper and better products.

This type of product can also benefit in other areas where it is not used as often if at all like building materials, replacement parts, safety equipment, renewable energy equipment, and healthcare.  

The smaller manufacturers may have to ramp up for this type of software by dedicating some engineering resources to figure out the variables, assumptions, etc. to create a realistic model.  This is the hard part.  You still have to know your product and how to successfully model all the constraints.
However, this is probably nothing more than hiring a couple of PhDs or consultants to help these companies get rolling.  The end result should be interesting.
]]></description>
		<content:encoded><![CDATA[<p>SSDM is something that has been around for a while.  Running mathematical models to find outliers is really not a new concept. The math is very complex, though, which is the reason most of us have had to make lots of assumptions to make the equations manageable to solve. It takes a lot of computing power to make this work.  In the past only large corporations had the budgets to buy the equipment to run more complex models.  This is the reason why you typically saw this type of analysis run only by large automotive, aerospace, and defense companies.  I think the time has finally come where you can actually run these models without the need to make hundreds of assumptions.  The cost of computing power is really cheap now and the number of calculations per second has dramatically increased in the last decade.</p>
<p>I think that this type of software will now put the SSDM analysis in the hands of the middle tier manufacturers.  This is a good thing since it should improve products and bring about more competition in areas that typically have been the domain of larger companies.  It also should bode well for the consumer who should see the effects of this type of product quality improvement in cheaper and better products.</p>
<p>This type of product can also benefit in other areas where it is not used as often if at all like building materials, replacement parts, safety equipment, renewable energy equipment, and healthcare.  </p>
<p>The smaller manufacturers may have to ramp up for this type of software by dedicating some engineering resources to figure out the variables, assumptions, etc. to create a realistic model.  This is the hard part.  You still have to know your product and how to successfully model all the constraints.<br />
However, this is probably nothing more than hiring a couple of PhDs or consultants to help these companies get rolling.  The end result should be interesting. </p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Jack Ring</title>
		<link>http://www.ncms.org/index.php/2008/09/02/designing-the-best-solution-not-the-best-guess/#comment-343</link>
		<dc:creator>Jack Ring</dc:creator>
		<pubDate>Fri, 05 Sep 2008 18:40:51 +0000</pubDate>
		<guid isPermaLink="false">/blog/post/High-Perf-Computing.aspx#comment-343</guid>
		<description><![CDATA[I suggest that the NCMS raise their sights from High Performance Computing to High Performance Information and Choice Making. 
 
High Performance Information and Choice Making focuses on the benefit for NCMS clients, rather than on a presumption of appropriate technology. 
 
Simulation reveals the anticipated behavior of a system model. Anticipated behavior guides systems design choice-making which becomes ever more important as the system gets bigger in extent, variety and ambiguity. However, the need for High Performance (meaning high cost relative to other platforms) stems from the notion that computing is the better platform for anticipating system behavior. NCMS should focus on the Ends --- Adequate, Accurate and Timely foresight about requisite system behavior --- not the Means --- High Performance Computing.
 
In fairness other kinds of platforms have not been available. However, with the advent of new processing architectures, typified by Patent Reg. # 7392229.B.2, it is now reasonable to think of a $100 chip doing the work of 3,400 microprocessors --- in microseconds.
 
If you want to look at the question of alternatives to high performance (high cost) computers, a good overview may be seen at &quot;The Von Neumann Syndrome&quot;, R. Hartenstein, downloaded January 14, 2008 from
http://www.fpl.uni-kl.de/staff/hartenstein/Hartenstein-Delft-Sep2007.pdf 
 
If you want to know more about the significance of the General Purpose Set Theoretic Processor described in Patent Reg. # 7392229.B.2, I will be happy to share a white paper regarding a Systems Viability and Verification Capability. Please advise.
 
Hope this helps move things along,

Jack Ring
Co-founder, Kennen Technologies LLC
Fellow, International Council on Systems Engineering
]]></description>
		<content:encoded><![CDATA[<p>I suggest that the NCMS raise their sights from High Performance Computing to High Performance Information and Choice Making. </p>
<p>High Performance Information and Choice Making focuses on the benefit for NCMS clients, rather than on a presumption of appropriate technology. </p>
<p>Simulation reveals the anticipated behavior of a system model. Anticipated behavior guides systems design choice-making which becomes ever more important as the system gets bigger in extent, variety and ambiguity. However, the need for High Performance (meaning high cost relative to other platforms) stems from the notion that computing is the better platform for anticipating system behavior. NCMS should focus on the Ends &#8212; Adequate, Accurate and Timely foresight about requisite system behavior &#8212; not the Means &#8212; High Performance Computing.</p>
<p>In fairness other kinds of platforms have not been available. However, with the advent of new processing architectures, typified by Patent Reg. # 7392229.B.2, it is now reasonable to think of a $100 chip doing the work of 3,400 microprocessors &#8212; in microseconds.</p>
<p>If you want to look at the question of alternatives to high performance (high cost) computers, a good overview may be seen at &quot;The Von Neumann Syndrome&quot;, R. Hartenstein, downloaded January 14, 2008 from<br />
<a href="http://www.fpl.uni-kl.de/staff/hartenstein/Hartenstein-Delft-Sep2007.pdf" rel="nofollow">http://www.fpl.uni-kl.de/staff/hartenstein/Hartenstein-Delft-Sep2007.pdf</a> </p>
<p>If you want to know more about the significance of the General Purpose Set Theoretic Processor described in Patent Reg. # 7392229.B.2, I will be happy to share a white paper regarding a Systems Viability and Verification Capability. Please advise.</p>
<p>Hope this helps move things along,</p>
<p>Jack Ring<br />
Co-founder, Kennen Technologies LLC<br />
Fellow, International Council on Systems Engineering</p>
]]></content:encoded>
	</item>
</channel>
</rss>