reliable dominant options (i.e., parameter sets) for practical hydrological forecasting in the study area. The entropy-based method has proved effective for analyzing and evaluating the performance of different combinations of objective functions, and it can provide more comprehensive and impartial decision support for hydrological forecasting. Copyright © 2020 Jiuyuan Huo and Liqun Liu.

Responsive EEG-based communication systems have been implemented with brain-computer interfaces (BCIs) based on code-modulated visual evoked potentials (c-VEPs). The BCI targets are typically encoded with binary m-sequences because of their autocorrelation property; the digits one and zero correspond to different target colours (usually black and white), which are updated every frame according to the code. While binary flickering patterns enable high communication speeds, many users perceive them as annoying. Quintary (base-5) m-sequences, where the five digits correspond to different shades of grey, may yield a more subtle visual stimulation. This study explores two approaches to reducing the flickering sensation: (1) adjusting the flickering speed via the refresh rate and (2) applying quintary codes. To this end, six flickering modalities are tested in an eight-target spelling application: binary and quintary patterns generated at 60, 120, and 240 Hz refresh rates. The study was conducted with 18 nondisabled participants, each of whom completed a copy-spelling task for all six flickering modalities. According to the questionnaire results, most users favoured the proposed quintary pattern over the binary one while achieving similar performance (no statistically significant differences between the patterns were found). Mean accuracies across participants were above 95%, and information transfer rates were above 55 bits/min, for all patterns and flickering speeds. Copyright © 2020 Felix W. Gembler et al.

We propose three quality control (QC) techniques using machine learning that depend on the type of input data used for training: QC based on the time series of a single weather element, QC based on time series in conjunction with other weather elements, and QC using spatiotemporal characteristics. We performed machine learning-based QC on each weather element of atmospheric data, such as temperature, acquired from seven types of IoT sensors, and applied machine learning algorithms, such as support vector regression, to data with errors to make meaningful estimates from them. Using the root mean squared error (RMSE), we evaluated the performance of the proposed techniques. As a result, QC conducted in conjunction with other weather elements had, on average, a 0.14% lower RMSE than QC conducted with a single weather element alone. In the case of QC that considers spatiotemporal characteristics, QC trained with AWS data achieved a 17% lower RMSE than QC trained with raw data alone. Copyright © 2020 Hye-Jin Kim et al.
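As background for the c-VEP abstract above: the autocorrelation property that makes binary m-sequences attractive for target encoding is easy to demonstrate. The sketch below (not the authors' stimulus code; the tap set and sequence length are illustrative choices) generates an m-sequence with a Fibonacci linear feedback shift register and checks the two-valued circular autocorrelation that circularly shifted target codes rely on.

```python
import numpy as np

def lfsr_msequence(taps, n):
    """Binary m-sequence of length 2**n - 1 from a Fibonacci LFSR.
    `taps` are 1-based register positions XORed into the feedback."""
    state = [1] * n                       # any nonzero seed works
    seq = []
    for _ in range(2 ** n - 1):
        seq.append(state[-1])             # output the last register bit
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]         # shift and insert the feedback bit
    return np.array(seq)

# Taps (6, 5) correspond to a primitive polynomial, so the period is maximal.
seq = lfsr_msequence(taps=(6, 5), n=6)    # length 63

# The property used for target identification: the circular autocorrelation
# of the +/-1-mapped sequence is N at lag 0 and -1 at every other lag.
s = 1 - 2 * seq
for lag in range(len(s)):
    r = int(np.dot(s, np.roll(s, lag)))
    assert r == (len(s) if lag == 0 else -1)
```

Quintary m-sequences are built analogously over GF(5), with the five digits mapped to shades of grey rather than to black and white.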
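For the QC abstract, the evaluation boils down to the RMSE of a regressor's estimate of one weather element from companion elements. A minimal sketch of that setup with scikit-learn's SVR; the variables and their relationship are synthetic stand-ins invented for illustration, not the authors' sensor dataset.

```python
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical stand-in data: estimate temperature from two companion
# weather elements (humidity, pressure) observed by the same station.
humidity = rng.uniform(30.0, 90.0, 500)
pressure = rng.uniform(990.0, 1030.0, 500)
temperature = 60.0 + 0.2 * humidity - 0.05 * pressure + rng.normal(0.0, 0.5, 500)

X = np.column_stack([humidity, pressure])
X_tr, X_te, y_tr, y_te = train_test_split(X, temperature, random_state=0)

svr = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, svr.predict(X_te)) ** 0.5
print(f"SVR estimate RMSE: {rmse:.3f}")
```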
In recent years, cloud computing technology has attracted extensive attention from both academia and industry. The popularity of cloud computing originated from its ability to deliver global IT services, such as core infrastructure, platforms, and applications, to cloud customers over the web. Furthermore, it promises on-demand services with new forms of pricing packages. However, cloud job scheduling is still NP-complete and has become more complicated due to factors such as resource dynamicity and on-demand consumer application requirements. To fill this gap, this paper presents a modified Harris hawks optimization (HHO) algorithm based on simulated annealing (SA) for scheduling jobs in the cloud environment. In the proposed HHOSA approach, SA is employed as a local search algorithm to improve the rate of convergence and the quality of the solutions generated by the standard HHO algorithm. The performance of the HHOSA method is compared with that of state-of-the-art job scheduling algorithms by implementing them all on the CloudSim toolkit. Both standard and synthetic workloads are employed to analyze the performance of the proposed HHOSA algorithm. The obtained results demonstrate that HHOSA achieves significant reductions in the makespan of the job scheduling problem compared with the standard HHO and other existing scheduling algorithms. Moreover, it converges faster as the search space grows, which makes it appropriate for large-scale scheduling problems. Copyright © 2020 Ibrahim Attiya et al.

Recent technological advances have enabled researchers to collect large amounts of electroencephalography (EEG) signals in labeled and unlabeled datasets. However, it is expensive and time-consuming to collect labeled EEG data for use in brain-computer interface (BCI) systems. In this paper, a novel active learning method is proposed to minimize the amount of labeled, subject-specific EEG data required for effective classifier training by combining measures of uncertainty and representativeness within an extreme learning machine (ELM). Following this approach, an ELM classifier was first used to select a relatively large batch of unlabeled examples whose uncertainty was measured through the best-versus-second-best (BvSB) strategy. The diversity of each sample was then measured between the limited labeled training data and the previously selected unlabeled samples, and similarity was measured among the previously selected samples. Finally, a tradeoff parameter was introduced to control the balance between informative and representative samples, which were then used to construct a powerful ELM classifier. Extensive experiments were conducted using benchmark and multiclass motor imagery EEG datasets to evaluate the efficacy of the proposed method. Experimental results show that the performance of the new algorithm exceeds or matches that of several state-of-the-art active learning algorithms, demonstrating that the proposed method improves classifier performance and reduces the need for training samples in BCI applications. Copyright © 2020 Qingshan She et al.
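The HHOSA abstract above uses SA as a local search over candidate schedules. The following self-contained sketch shows what such an SA refinement step can look like for a job-to-VM assignment under a makespan objective; the problem instance, neighborhood move, and cooling schedule are illustrative assumptions, not the paper's exact design.

```python
import math
import random

random.seed(1)

# Hypothetical instance: job lengths (MI) scheduled onto VMs with
# different speeds (MIPS); the objective is the makespan.
jobs = [random.randint(100, 1000) for _ in range(30)]
vm_speed = [100.0, 200.0, 400.0]

def makespan(assign):
    load = [0.0] * len(vm_speed)
    for length, vm in zip(jobs, assign):
        load[vm] += length / vm_speed[vm]
    return max(load)

def sa_refine(assign, t0=5.0, cooling=0.995, steps=3000):
    """SA local search: move one job to a random VM and accept worse
    schedules with probability exp(-delta / T), cooling T each step."""
    cur, cur_cost = list(assign), makespan(assign)
    best, best_cost, temp = list(cur), cur_cost, t0
    for _ in range(steps):
        cand = list(cur)
        cand[random.randrange(len(jobs))] = random.randrange(len(vm_speed))
        delta = makespan(cand) - cur_cost
        if delta < 0 or random.random() < math.exp(-delta / temp):
            cur, cur_cost = cand, cur_cost + delta
            if cur_cost < best_cost:
                best, best_cost = list(cur), cur_cost
        temp *= cooling
    return best, best_cost

start = [random.randrange(len(vm_speed)) for _ in jobs]
_, refined_cost = sa_refine(start)
print(f"makespan: {makespan(start):.2f} -> {refined_cost:.2f}")
```

In HHOSA this refinement would be applied to solutions produced by the HHO exploration phase rather than to a random starting assignment.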
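The BvSB strategy in the active learning abstract ranks unlabeled samples by the margin between the two highest class scores: the smaller the margin, the more uncertain the sample. A small sketch, assuming the classifier outputs per-class scores (an ELM's raw outputs would be normalized first; the data here are hypothetical):

```python
import numpy as np

def bvsb_select(scores, batch_size):
    """Best-versus-second-best uncertainty: rank samples by the margin
    between their two largest class scores and pick the smallest margins."""
    top_two = np.sort(scores, axis=1)[:, -2:]
    margin = top_two[:, 1] - top_two[:, 0]
    return np.argsort(margin)[:batch_size]   # smallest margins first

# Hypothetical class scores for six unlabeled EEG trials, three classes.
scores = np.array([
    [0.34, 0.33, 0.33],
    [0.90, 0.05, 0.05],
    [0.48, 0.47, 0.05],
    [0.70, 0.20, 0.10],
    [0.50, 0.30, 0.20],
    [0.40, 0.39, 0.21],
])
print(bvsb_select(scores, batch_size=3))     # -> [0 2 5]
```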
Fuzzy c-means (FCM) is one of the best-known clustering methods for automatically organizing a wide variety of datasets and acquiring accurate classifications, but it has a tendency to fall into local minima. To overcome these weaknesses, several methods that hybridize particle swarm optimization (PSO) and FCM have been proposed in the literature, and such hybrids have been shown to improve accuracy over traditional partition clustering approaches. However, PSO-based clustering methods have poor execution times in comparison to partitional clustering techniques, and current PSO algorithms require tuning a range of parameters before they are able to find good solutions. Therefore, this paper introduces a hybrid method for fuzzy clustering, named FCM-ELPSO, which aims to address these shortcomings. It combines FCM with an improved version of PSO, called ELPSO, which adopts a new enhanced logarithmic inertia weight strategy to provide a better balance between exploration and exploitation. This new hybrid method uses the PBM(F) index and the objective function value as cluster validity indexes to evaluate the clustering effect.
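The abstract names an "enhanced logarithmic inertia weight" but does not give its formula, so the schedule below is only an illustrative logarithmic decay between the commonly used PSO bounds w_max = 0.9 and w_min = 0.4, shown alongside the standard velocity update such a schedule would plug into.

```python
import math

def log_inertia(t, t_max, w_max=0.9, w_min=0.4):
    """Illustrative logarithmic decay from w_max to w_min over t_max
    iterations; the exact ELPSO schedule is not given in the abstract."""
    return w_max - (w_max - w_min) * math.log1p(t) / math.log1p(t_max)

# Standard PSO velocity update in which an inertia weight appears:
#   v = w(t) * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
# A large early w favours exploration; the decayed w favours exploitation.
for t in (0, 10, 50, 100):
    print(t, round(log_inertia(t, t_max=100), 3))
```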