<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xml:base="http://132.161.132.157/drupal6"  xmlns:dc="http://purl.org/dc/elements/1.1/">
<channel>
 <title>Computer Science - neural networks</title>
 <link>http://132.161.132.157/drupal6/taxonomy/term/671/0</link>
 <description></description>
 <language>en</language>
<item>
 <title>Thursday Extra (September 6): &quot;Adversarial Examples; or, When Is a School Bus an Ostrich?&quot;</title>
 <link>http://132.161.132.157/drupal6/node/978</link>
 <description>&lt;p&gt;
On Thursday, September 6, John Stone will give a talk on adversarial examples: inputs to classification, assessment, or diagnosis software that are specifically contrived to elicit incorrect or misleading results. Many applications based on neural networks configured by machine-learning algorithms have been found to be vulnerable to such examples. The talk will explain the nature of the vulnerability and explore possible explanations for it.
&lt;/p&gt;

&lt;p&gt;
At 4:00 p.m., refreshments will be served in the Computer Science Commons. The talk, &quot;Adversarial Examples; or, When Is a School Bus an Ostrich?&quot;, will begin at 4:15 p.m. in Noyce 3821. Everyone is welcome to attend!
&lt;/p&gt;</description>
 <comments>http://132.161.132.157/drupal6/node/978#comments</comments>
 <category domain="http://132.161.132.157/drupal6/taxonomy/term/672">adversarial examples</category>
 <category domain="http://132.161.132.157/drupal6/taxonomy/term/120">machine learning</category>
 <category domain="http://132.161.132.157/drupal6/taxonomy/term/671">neural networks</category>
 <category domain="http://132.161.132.157/drupal6/taxonomy/term/42">Thursday Extras</category>
 <pubDate>Mon, 03 Sep 2018 13:14:34 +0000</pubDate>
 <dc:creator>stone</dc:creator>
 <guid isPermaLink="false">978 at http://132.161.132.157/drupal6</guid>
</item>
</channel>
</rss>
