The 9th Workshop on Workflows in Support of Large-Scale Science
in conjunction with SC14 (New Orleans, Louisiana, Nov. 16, 2014)
http://works.cs.cardiff.ac.uk
Call For Papers
==============================
Data-Intensive Workflows (a.k.a. scientific workflows) are a key technology for managing Big Data analytics in all scientific areas, exploiting the capabilities of large-scale distributed and parallel computing infrastructures. Workflows enable scientists to design complex analyses composed of individual application components or services that are developed collaboratively. On the large-scale computing infrastructures routinely used for e-Science today, workflow management systems provide both a formal description of distributed processes and an engine to enact applications composed of a wealth of concurrent processes. Furthermore, workflow enactment engines often ensure data traceability by recording data-processing provenance traces during execution.
The size of the data and the scale of the data analysis flows often require the management of complex and distributed data sets. Workflow formalisms that include adequate structures for representing Big Data sets and their concurrent processing are needed. Beyond the sheer magnitude of the data processed by the workflow components, the intermediate and resulting data need to be annotated with provenance and other information so that the quality of the data can be evaluated and the repeatability of the analysis supported.
The design and execution of workflows in a distributed environment can be very complex, involving multiple stages that include the textual or graphical specification of the workflow, the mapping of the high-level workflow description onto the available resources, and the monitoring and debugging of the subsequent execution. Furthermore, since computations and data access operations are performed on shared resources, there is increasing interest in the fair allocation and management of those resources at the workflow level.
Data-driven computation is increasingly used to address Big Data challenges. Yet scientific experiments also require the description of complex control flows. Adequate workflow descriptions are needed to support the workflow management process, which includes workflow design, workflow reuse, and modifications made to the workflow over time, for example modifications to individual workflow components. Additional workflow annotations may provide guidelines and requirements for resource mapping and execution.
The Ninth Workshop on Workflows in Support of Large-Scale Science focuses on the entire workflow lifecycle, including workflow design, mapping, robust execution, and the recording of provenance information. The workshop also welcomes contributions in the applications area, from which requirements on workflow management systems can be derived. The topics of the workshop include, but are not limited to:
- Big Data analytics workflows.
- Data-driven workflow processing.
- Workflow composition, tools and languages.
- Workflow execution in distributed environments.
- Workflows on the cloud.
- Exascale computing with workflows.
- Workflow refinement tools that can manage the workflow mapping process.
- Workflow fault-tolerance and recovery techniques.
- Workflow user environments, including portals.
- Workflow applications and their requirements.
- Adaptive workflows.
- Workflow monitoring.
- Workflow optimizations.
- Performance analysis of workflows.
- Workflow debugging.
- Workflow provenance.
- Interactive workflows.
- Workflow interoperability.
Important Dates:
- Papers due July 15th, 2014
- Notifications of acceptance September 1st, 2014
- Final papers due October 1st, 2014
Program Committee Chairs:
- Johan Montagnat, CNRS, France
- Ian Taylor, Cardiff University, UK
Tentative Program Committee Members:
- Khalid Belhajjame, University of Manchester
- Adam Belloum, University of Amsterdam
- Ivona Brandic, Vienna University of Technology
- Marian Bubak, AGH Krakow & University of Amsterdam
- Ann Chervenak, University of Southern California
- Ewa Deelman, USC Information Sciences Institute
- Sandra Gesing, University of Notre Dame
- Yolanda Gil, USC Information Sciences Institute
- Tristan Glatard, CNRS
- Péter Kacsuk, MTA SZTAKI
- Dimka Karastoyanova, Stuttgart University
- Daniel S. Katz, University of Chicago & Argonne National Laboratory
- Tamas Kiss, University of Westminster
- Dagmar Krefting, University of Applied Sciences Berlin
- Maciej Malawski, AGH University of Science and Technology
- Stephen McGough, Newcastle University
- Jarek Nabrzyski, University of Notre Dame
- Cesare Pautasso, University of Lugano
- Radu Prodan, University of Innsbruck
- Chase Qishi Wu, University of Memphis
- Omer Rana, Cardiff University
- David De Roure, Oxford University
- Rizos Sakellariou, University of Manchester
- Gabor Terstyanszky, University of Westminster
- Michael Wilde, University of Chicago & Argonne National Laboratory