Printed from https://ideas.repec.org/a/plo/pone00/0123811.html

MEVA - An Interactive Visualization Application for Validation of Multifaceted Meteorological Data with Multiple 3D Devices

Author

Listed:
  • Carolin Helbig
  • Lars Bilke
  • Hans-Stefan Bauer
  • Michael Böttinger
  • Olaf Kolditz

Abstract

Background: To achieve more realistic simulations, meteorologists develop and use models with increasing spatial and temporal resolution. Analyzing, comparing, and visualizing the resulting simulations becomes increasingly challenging due to the growing volume and multifaceted character of the data. Various data sources, numerous variables, and multiple simulations lead to a complex database. Although a variety of software suited for the visualization of meteorological data exists, none fulfills all of the typical domain-specific requirements: support for quasi-standard data formats and different grid types, standard visualization techniques for scalar and vector data, visualization of context (e.g., topography) and other static data, support for multiple presentation devices used in modern science (e.g., virtual reality), a user-friendly interface, and suitability for cooperative work. Methods and Results: Instead of attempting to develop yet another visualization system to fulfill all possible needs in this application domain, our approach is to provide a flexible workflow that combines existing state-of-the-art visualization software components in order to hide the complexity of 3D data visualization tools from the end user. To complete the workflow and to enable domain scientists to interactively visualize their data without advanced skills in 3D visualization systems, we developed a lightweight custom visualization application (MEVA - multifaceted environmental data visualization application) that supports the most relevant visualization and interaction techniques and can be easily deployed. Specifically, our workflow combines a variety of data abstraction methods provided by a state-of-the-art 3D visualization application with the interaction and presentation features of a game engine.
Our customized application includes solutions for the analysis of multirun data, specifically with respect to data uncertainty and differences between simulation runs. In an iterative development process, our easy-to-use application was developed in close cooperation with meteorologists and visualization experts. The usability of the application has been validated with user tests. We report on how this application supports users in confirming and refuting existing hypotheses and in discovering new insights. In addition, the application has been used at public events to communicate research results.

Suggested Citation

  • Carolin Helbig & Lars Bilke & Hans-Stefan Bauer & Michael Böttinger & Olaf Kolditz, 2015. "MEVA - An Interactive Visualization Application for Validation of Multifaceted Meteorological Data with Multiple 3D Devices," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-24, April.
  • Handle: RePEc:plo:pone00:0123811
    DOI: 10.1371/journal.pone.0123811

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0123811
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0123811&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0123811?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Christoph Safferling & Aaron Lowen, 2011. "Economics in the Kingdom of Loathing: Analysis of Virtual Market Data," Working Paper Series of the Department of Economics, University of Konstanz 2011-30, Department of Economics, University of Konstanz.
    2. Prpić, John & Shukla, Prashant P. & Kietzmann, Jan H. & McCarthy, Ian P., 2015. "How to work a crowd: Developing crowd capital through crowdsourcing," Business Horizons, Elsevier, vol. 58(1), pages 77-85.
    3. Kovacs, Attila, 2018. "Gender Differences in Equity Crowdfunding," OSF Preprints 5pcmb, Center for Open Science.
    4. Kovacs, Attila, 2018. "Gender Differences in Equity Crowdfunding," OSF Preprints 5pcmb_v1, Center for Open Science.
    5. Naihui Zhou & Zachary D Siegel & Scott Zarecor & Nigel Lee & Darwin A Campbell & Carson M Andorf & Dan Nettleton & Carolyn J Lawrence-Dill & Baskar Ganapathysubramanian & Jonathan W Kelly & Iddo Fried, 2018. "Crowdsourcing image analysis for plant phenomics to generate ground truth data for machine learning," PLOS Computational Biology, Public Library of Science, vol. 14(7), pages 1-16, July.
    6. P Douglas Renfrew & Gabrielle Campbell & Charlie E M Strauss & Richard Bonneau, 2011. "The 2010 Rosetta Developers Meeting: Macromolecular Prediction and Design Meets Reproducible Publishing," PLOS ONE, Public Library of Science, vol. 6(8), pages 1-5, August.
    7. Spartaco Albertarelli & Piero Fraternali & Sergio Herrera & Mark Melenhorst & Jasminko Novak & Chiara Pasini & Andrea-Emilio Rizzoli & Cristina Rottondi, 2018. "A Survey on the Design of Gamified Systems for Energy and Water Sustainability," Games, MDPI, vol. 9(3), pages 1-34, June.
    8. Robert Swain & Alex Berger & Josh Bongard & Paul Hines, 2015. "Participation and Contribution in Crowdsourced Surveys," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-21, April.
    9. Franzoni, Chiara & Sauermann, Henry, 2014. "Crowd science: The organization of scientific research in open collaborative projects," Research Policy, Elsevier, vol. 43(1), pages 1-20.
    10. Sam Mavandadi & Stoyan Dimitrov & Steve Feng & Frank Yu & Uzair Sikora & Oguzhan Yaglidere & Swati Padmanabhan & Karin Nielsen & Aydogan Ozcan, 2012. "Distributed Medical Image Analysis and Diagnosis through Crowd-Sourced Games: A Malaria Case Study," PLOS ONE, Public Library of Science, vol. 7(5), pages 1-8, May.
    11. Samuel Arbesman & Nicholas A Christakis, 2011. "Eurekometrics: Analyzing the Nature of Discovery," PLOS Computational Biology, Public Library of Science, vol. 7(6), pages 1-2, June.
    12. Sherwani, Y & Ahmed, M & Muntasir, M & El-Hilly, A & Iqbal, S & Siddiqui, S & Al-Fagih, Z & Usmani, O & Eisingerich, AB, 2015. "Examining the role of gamification and use of mHealth apps in the context of smoking cessation: A review of extant knowledge and outlook," Working Papers 25458, Imperial College, London, Imperial College Business School.
    13. Prpić, John, 2017. "How To Work A Crowd: Developing Crowd Capital Through Crowdsourcing," SocArXiv jer9k_v1, Center for Open Science.
    14. Joanna Chataway & Sarah Parks & Elta Smith, 2017. "How Will Open Science Impact on University-Industry Collaboration?," Foresight and STI Governance (Foresight-Russia till No. 3/2015), National Research University Higher School of Economics, vol. 11(2), pages 44-53.
    15. Ayat Abourashed & Laura Doornekamp & Santi Escartin & Constantianus J. M. Koenraadt & Maarten Schrama & Marlies Wagener & Frederic Bartumeus & Eric C. M. van Gorp, 2021. "The Potential Role of School Citizen Science Programs in Infectious Disease Surveillance: A Critical Review," IJERPH, MDPI, vol. 18(13), pages 1-18, June.
    16. Jennifer Lewis Priestley & Robert J. McGrath, 2019. "The Evolution of Data Science: A New Mode of Knowledge Production," International Journal of Knowledge Management (IJKM), IGI Global, vol. 15(2), pages 97-109, April.
    17. Vito D’Orazio & Michael Kenwick & Matthew Lane & Glenn Palmer & David Reitter, 2016. "Crowdsourcing the Measurement of Interstate Conflict," PLOS ONE, Public Library of Science, vol. 11(6), pages 1-21, June.
    18. Yury Kryvasheyeu & Haohui Chen & Esteban Moro & Pascal Van Hentenryck & Manuel Cebrian, 2015. "Performance of Social Network Sensors during Hurricane Sandy," PLOS ONE, Public Library of Science, vol. 10(2), pages 1-19, February.
    19. Jeffrey Laut & Francesco Cappa & Oded Nov & Maurizio Porfiri, 2015. "Increasing Patient Engagement in Rehabilitation Exercises Using Computer-Based Citizen Science," PLOS ONE, Public Library of Science, vol. 10(3), pages 1-17, March.
    20. Prpić, John, 2017. "How To Work A Crowd: Developing Crowd Capital Through Crowdsourcing," SocArXiv jer9k, Center for Open Science.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0123811. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.