start-ver=1.4 cd-journal=joma no-vol= cd-vols= no-issue= article-no= start-page=390 end-page=395 dt-received= dt-revised= dt-accepted= dt-pub-year=2015 dt-pub=201503 dt-online= en-article= kn-article= en-subject= kn-subject= en-title= kn-title=Sound collection and visualization system enabled participatory and opportunistic sensing approaches en-subtitle= kn-subtitle= en-abstract= kn-abstract=This paper presents a sound collection system to visualize environmental sounds that are collected using a crowdsourcing approach. An analysis of physical features is generally used to analyze sound properties; however, human beings not only analyze sounds but also connect to them emotionally. If we want to visualize sounds according to the characteristics of the listener, we need to collect not only the raw sounds but also the subjective feelings associated with them. For this purpose, we developed a sound collection system using a crowdsourcing approach to simultaneously collect physical sounds, their statistics, and subjective evaluations. We then conducted a sound collection experiment using the developed system with ten participants. We collected 6,257 samples of equivalent loudness levels and their locations, and 516 samples of sounds and their locations. Subjective evaluations by the participants are also included in the data. Next, we visualized the sounds on a map: the loudness levels are visualized as a color map, and the sounds are visualized as icons that indicate the sound type. Finally, we conducted a sound discrimination experiment to implement automatic conversion from sounds to appropriate icons. The classifier is trained on the basis of the GMM-UBM (Gaussian Mixture Model and Universal Background Model) method. Experimental results show an F-measure of 0.52 and an AUC of 0.79.
en-copyright= kn-copyright= en-aut-name=HaraSunao en-aut-sei=Hara en-aut-mei=Sunao kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=1 ORCID= en-aut-name=AbeMasanobu en-aut-sei=Abe en-aut-mei=Masanobu kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=2 ORCID= en-aut-name=SoneharaNoboru en-aut-sei=Sonehara en-aut-mei=Noboru kn-aut-name= kn-aut-sei= kn-aut-mei= aut-affil-num=3 ORCID= affil-num=1 en-affil= kn-affil=Graduate School of Natural Science and Technology Okayama University affil-num=2 en-affil= kn-affil=Graduate School of Natural Science and Technology Okayama University affil-num=3 en-affil= kn-affil=National Institute of Informatics END