From: David Cohn
Date: Mon, 11 Sep 95 18:06:01 EDT
Subject: NIPS*95 Registration Info Available

                     CONFERENCE ANNOUNCEMENT

              Neural Information Processing Systems
                      Natural and Synthetic

            Monday, Nov. 27 - Saturday, Dec. 2, 1995
                        Denver, Colorado

         http://www.cs.cmu.edu/Web/Groups/NIPS/NIPS.html

This is the ninth meeting of an interdisciplinary conference which
brings together neuroscientists, engineers, computer scientists,
cognitive scientists, physicists, and mathematicians interested in
all aspects of neural processing and computation. The conference
will include invited talks, and oral and poster presentations of
refereed papers. There will be no parallel sessions. There will
also be one day of tutorial presentations (Nov. 27) preceding the
regular session, and two days of focused workshops will follow at
a nearby ski area (Dec. 1-2).

Major conference topics include: Neuroscience, Theory,
Implementations, Applications, Algorithms & Architectures, Visual
Processing, Speech/Handwriting/Signal Processing, Cognitive
Science & AI, Control, Navigation and Planning.

Detailed information and registration materials are available
electronically at

    http://www.cs.cmu.edu/Web/Groups/NIPS/NIPS.html
    ftp://psyche.mit.edu/pub/NIPS95/

Students who require financial support to attend the conference
are urged to retrieve a copy of the registration brochure as soon
as possible in order to meet the aid application deadline.

Mail general inquiries/requests for registration material to:

    NIPS*95 Registration
    Dept. of Mathematical and Computer Sciences
    Colorado School of Mines
    Golden, CO 80401 USA
    FAX: (303) 273-3875
    e-mail: nips95@mines.colorado.edu


From: Rich Caruana
Date: Fri, 15 Sep 95 09:26:38 -0400
Subject: NIPS*95 Workshop on Transfer: Call for Participation

              *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
              *-*   POST-NIPS*95 WORKSHOP  *-*
              *-*   December 1-2, 1995     *-*
              *-*   Vail, Colorado         *-*
              *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
              *-*  CALL FOR PARTICIPATION  *-*
              *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*

TITLE: "Learning to Learn: Knowledge Consolidation and Transfer
        in Inductive Systems"

ORGANIZERS: Jon Baxter, Rich Caruana, Tom Mitchell, Lori Pratt,
            Danny Silver, Sebastian Thrun.

INVITED TALKS BY:
    Leo Breiman (Stanford, undecided)
    Tom Mitchell (CMU)
    Tomaso Poggio (MIT)
    Noel Sharkey (Sheffield)
    Jude Shavlik (Wisconsin)

WEB PAGES (for more information):
    Our Workshop: http://www.cs.cmu.edu/afs/cs/usr/caruana/pub/transfer.html
    NIPS*95 Info: http://www.cs.cmu.edu/afs/cs/project/cnbc/nips/NIPS.html

WORKSHOP DESCRIPTION:

The power of tabula rasa learning is limited. Because of this,
interest is increasing in methods that capitalize on previously
acquired domain knowledge. Examples of these methods include:

  o using symbolic domain theories to bias connectionist networks
  o using unsupervised learning on a large corpus of unlabelled
    data to learn features useful for subsequent supervised
    learning on a smaller labelled corpus
  o using models previously learned for other problems as a bias
    when learning new, but related, problems
  o using extra outputs on a connectionist network to bias the
    hidden layer representation towards more predictive features
    (a short sketch of this idea follows the description below)

There are many different approaches: hints, knowledge-based
artificial neural nets (KBANN), explanation-based neural nets
(EBNN), multitask learning (MTL), knowledge consolidation, etc.
What they all have in common is the attempt to transfer knowledge
from other sources to benefit the current inductive task.
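To make the "extra outputs" item above concrete, here is a minimal
illustrative sketch (not part of the original announcement) of a
multitask-style network: a main task and an auxiliary task share one
hidden layer, so gradients from the auxiliary output help shape the
hidden representation used by the main task. The toy tasks, network
sizes, and learning rate are assumptions chosen only for illustration;
it uses plain NumPy rather than any particular neural-net package.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: two related binary tasks computed from the same 10 inputs
    # (purely illustrative; any related tasks would do).
    X = rng.normal(size=(200, 10))
    y_main = (X[:, 0] + X[:, 1] > 0).astype(float)   # task we care about
    y_aux  = (X[:, 0] - X[:, 2] > 0).astype(float)   # related "extra output"
    Y = np.stack([y_main, y_aux], axis=1)             # shape (200, 2)

    n_hidden, lr = 8, 0.1
    W1 = rng.normal(scale=0.1, size=(10, n_hidden))   # shared input -> hidden
    W2 = rng.normal(scale=0.1, size=(n_hidden, 2))    # hidden -> both outputs

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for epoch in range(500):
        H = np.tanh(X @ W1)          # shared hidden representation
        P = sigmoid(H @ W2)          # predictions for both tasks
        # Cross-entropy gradients; both tasks push on the same hidden layer.
        dZ2 = (P - Y) / len(X)
        dW2 = H.T @ dZ2
        dH  = dZ2 @ W2.T
        dW1 = X.T @ (dH * (1.0 - H**2))
        W1 -= lr * dW1
        W2 -= lr * dW2

    P = sigmoid(np.tanh(X @ W1) @ W2)
    acc = ((P[:, 0] > 0.5) == y_main).mean()
    print("main-task training accuracy with auxiliary output: %.2f" % acc)

Dropping the auxiliary column from Y (and the second output unit)
turns this back into single-task learning, which is the comparison
the transfer methods discussed at the workshop aim to beat.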
The goal of this workshop is to provide an opportunity for
researchers and practitioners to discuss problems and progress in
knowledge transfer in learning. We hope to identify research
directions, debate different theories and approaches, discover
unifying principles, and begin answering questions like:

  o when will transfer help -- or hinder?
  o what should be transferred?
  o how should it be transferred?
  o what are the benefits?
  o in what domains is transfer most useful?

SUBMISSIONS:

We solicit presentations from anyone working in (or near):

  o Sequential/incremental, compositional (learning by parts), and
    parallel learning
  o Task knowledge transfer (symbolic-neural, neural-neural)
  o Adaptation of learning algorithms based on prior learning
  o Learning domain-specific inductive bias
  o Combining predictions made for related tasks from one domain
  o Combining supervised learning (where the goal is to learn one
    feature from the other features) with unsupervised learning
    (where the goal is to learn every feature from all the other
    features)
  o Combining symbolic and connectionist methods via transfer
  o Fundamental problems/issues in learning to learn
  o Theoretical models of learning to learn
  o Cognitive models of, or evidence for, transfer in learning

Please send a short (one page or less) description of what you
want to present to one of the co-chairs below by Oct 15. Email is
preferred. We'll select from the submissions and publish a
workshop schedule by Nov 1. Preference will be given to
submissions that are likely to generate debate and that go beyond
summarizing prior published work by raising important issues or
suggesting directions for future work. Suggestions for moderator-
or panel-led discussions (e.g., sequential vs. parallel transfer)
are also encouraged. We plan to run the workshop as a workshop,
not as a mini-conference, so be daring! We look forward to your
submission.

Rich Caruana                          Daniel L. Silver
School of Computer Science            Department of Computer Science
Carnegie Mellon University            Middlesex College
5000 Forbes Avenue                    University of Western Ontario
Pittsburgh, PA 15213, USA             London, Ontario, Canada N6A 3K7
email: caruana@cs.cmu.edu             email: dsilver@csd.uwo.ca
ph: (412) 268-3043                    ph: (519) 473-6168
fax: (412) 268-5576                   fax: (519) 661-3515