Please note the new deadlines for paper submission are June 26 (abstract
deadline) and July 3 (full paper deadline).
We apologize if you receive multiple copies of this call for papers. Please
distribute the call for papers and call for benchmarks to your colleagues
who may be interested in submitting papers or benchmarks.
For further information and for paper/benchmark submission instructions,
please refer to http://www.iiswc.org/iiswc2006/
Yan Solihin
****************************************************************************
CALL FOR PAPERS
The Annual IEEE International Symposium on Workload Characterization
IISWC 2006
Sponsored by the IEEE Computer Society and its Technical Committee on Computer Architecture
October 25-28, 2006
Hilton Hotel, San Jose, California, USA
(Immediately following ASPLOS XII at the same venue)
****************************************************************************
This symposium, in its second year, has its roots in the former Workshop on
Workload Characterization (WWC). It is dedicated to the understanding and
characterization of the workloads that run on all types of computer systems.
New applications and programming paradigms continue to emerge as the use of
computers becomes more widespread and more sophisticated. Advances in process
and communication technology, together with innovations in microarchitecture,
compilers, and virtual machine technology, are also changing the nature of the
problems being solved by computing systems. Whether for PDAs at the low end
or massively parallel systems at the high end, the design of tomorrow’s
computing machines can be significantly improved through knowledge of, and the
ability to simulate, the workloads expected to run on them.
Papers are solicited in all areas related to the characterization of computing
system workloads. Topics of interest include (but are not limited to):
Characterization of applications in areas like
o Search engines, E-commerce, Web servers, Databases
o Games, console and online
o Embedded, Mobile, Multimedia
o Life Sciences, Bio-Informatics, Scientific computing
o Security, Reliability, Biometrics
Characterization of middleware and library behavior
o Websphere, .NET, Java VM, CLI
o Graphics libraries, scientific libraries
Characterization of system behavior, including
o Operating system and hypervisor effects
o Effects due to virtualization and dynamic optimization
Implications of workloads for design issues, such as
o Microarchitecture, memory hierarchy, I/O
o Power management, reliability
o Instruction set architecture
Benchmark creation issues, including
o Multithreaded benchmarks
o Profiling, trace collection, synthetic traces
o Validation of benchmarks
Abstract modeling of program behavior
**** Important Dates ****
Abstract Submission: June 26, 2006
Final Paper Submission: July 3, 2006
Notification of Acceptance: August 25, 2006
Final Manuscript Submission: September 8, 2006
***************************************************************************
CALL FOR IISWC BENCHMARKS
IISWC is building a set of benchmarks to distribute to researchers. You are
invited to participate in the creation of this benchmark set. In addition to
the normal paper submissions, IISWC will accept "benchmark submissions." These
comprise C, C++, or Java code, inputs to the code, and an associated six-page
paper. The paper should explain the benchmark: what it does and why it is
relevant to a particular user community. Benchmark authors should be willing
to allow distribution of their source and input sets. Code must be open source
consistent with the GNU General Public License; see
http://www.gnu.org/licenses.
The criteria used to judge benchmarks include:
- significance of the benchmark to a user community,
- ease of use,
- quality and clarity of the benchmark writeup, and
- portability of the submitted code.
Successful submissions will be included in the IISWC benchmark set and their
associated descriptive short papers published in the IISWC proceedings.
**** Important Dates ****
Paper and Benchmark Submission: July 9, 2006
Notification: August 25, 2006
Camera Ready Paper: September 8, 2006
Submissions will be in the form of a webpage that will include documentation
on the benchmark suite, as well as source code for the benchmarks and
benchmark inputs.
Send all submissions to David Kaeli at kaeli@ece.neu.edu.
**** Organizing committee ****
General Chair
D. N. (Jay) Jayasimha, Intel Corporation
Program Chair
Ravi Nair, IBM Watson Research Center
Program Committee
Doug Burger, University of Texas at Austin
Trey Cain, IBM
Pradeep Dubey, Intel
Lieven Eeckhout, Ghent University
Antonio Gonzalez, Intel and UPC
Byeong Kil Lee, Texas Instruments
Kevin Lepak, AMD
Chuck Moore, AMD
Ram Rajamony, IBM
Steve Reinhardt, Reservoir Labs and University of Michigan
Eric Rotenberg, North Carolina State University
Tim Sherwood, University of California, Santa Barbara
Yan Solihin, North Carolina State University
Lawrence Spracklen, Sun Microsystems
Eric Tune, Google
Dee Weikle, University of Virginia
Benchmark Chair
David Kaeli, Northeastern University
Steering Committee
Pradip Bose, IBM Research
Tom Conte, North Carolina State University
Lieven Eeckhout, Ghent University
Lizy John, University of Texas at Austin
David Kaeli, Northeastern University
David Lilja, University of Minnesota
Ann Marie Maynard, IBM
John Shen, Intel