
oe1 (光电查) - Scientific Papers

262 records
  • Multivariate feature extraction based supervised machine learning for fault detection and diagnosis in photovoltaic systems

    Abstract: Fault detection and diagnosis (FDD) in photovoltaic (PV) arrays has become a challenge due to the magnitudes of the faults, the presence of maximum power point trackers, non-linear PV characteristics, and the dependence on insolation. Thus, the aim of this paper is to develop an improved FDD technique for PV system faults. The common FDD technique generally has two main steps: feature extraction and selection, and fault classification. Multivariate feature extraction and selection is very important for multivariate statistical system monitoring: it reduces the dimensionality of the modeling data and improves monitoring accuracy. Therefore, in the proposed FDD approach, the principal component analysis (PCA) technique is used for extracting and selecting the most relevant multivariate features, and supervised machine learning (SML) classifiers are applied for fault diagnosis. The FDD performance is evaluated via different metrics using data extracted from different operating conditions of a grid-connected photovoltaic (GCPV) system. The obtained results confirm the feasibility and effectiveness of the proposed approach for fault detection and diagnosis.

    Keywords: fault classification, fault diagnosis, photovoltaic (PV) systems, feature extraction, Supervised machine learning (SML), principal component analysis (PCA)

    Updated 2025-09-19 17:13:59
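
A minimal sketch of the PCA-plus-SML pipeline described in this abstract, assuming tabular PV measurements with labeled fault classes; the placeholder file names, the 95% variance cutoff, and the choice of an SVM classifier are illustrative assumptions rather than details from the paper.

```python
# Hypothetical sketch: PCA feature extraction followed by a supervised classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# X: one row of PV measurements per sample (e.g. string currents, voltages, power, irradiance)
# y: integer fault labels (0 = healthy, 1..k = fault classes)
X, y = np.load("pv_features.npy"), np.load("pv_labels.npy")  # placeholder data files

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Standardize, keep the principal components explaining 95% of the variance,
# then classify the reduced features with a supervised learner.
model = make_pipeline(StandardScaler(), PCA(n_components=0.95), SVC(kernel="rbf"))
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```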

  • Multi‐Resonance Induced Thermally Activated Delayed Fluorophores for Narrowband Green OLEDs

    Abstract: As data scales grow, distributed machine learning has received more and more attention. However, as the data grow, the dimensionality of the dataset increases rapidly, which inflates the communication traffic in the distributed computing cluster and degrades the performance of distributed algorithms. This paper proposes a message filtering strategy based on the asynchronous alternating direction method of multipliers (ADMM), which effectively reduces the communication time of the algorithm while ensuring its convergence. A soft-threshold filtering strategy based on L1 regularization is proposed to filter the parameters of the master node, and a gradient-truncation filtering strategy is proposed to filter the parameters of the slave nodes. In addition, the algorithm is updated asynchronously to reduce the waiting time of the master node. Experiments on large-scale sparse data show that the algorithm effectively reduces message traffic and reaches convergence in a shorter time.

    Keywords: gradient truncation, distributed machine learning, message filtering, asynchronous update, ADMM, L1 regularization

    Updated 2025-09-19 17:13:59
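
A small sketch of the two filtering operators named in this abstract, under the assumption that parameters and gradients are dense NumPy vectors; the thresholds are arbitrary and the surrounding asynchronous ADMM loop is omitted.

```python
import numpy as np

def soft_threshold(w: np.ndarray, lam: float) -> np.ndarray:
    """L1 soft-thresholding: zero out entries with |w_i| <= lam and shrink the rest.
    Zeroed entries need not be transmitted, which shrinks the master's messages."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def truncate_gradient(g: np.ndarray, keep: int) -> np.ndarray:
    """Gradient truncation: keep only the `keep` largest-magnitude entries,
    so a worker sends a sparse update instead of the full dense gradient."""
    out = np.zeros_like(g)
    idx = np.argpartition(np.abs(g), -keep)[-keep:]
    out[idx] = g[idx]
    return out

# Illustrative use: dense vectors become sparse before being sent over the network.
w = np.random.randn(10_000)
print(np.count_nonzero(soft_threshold(w, 1.0)), np.count_nonzero(truncate_gradient(w, 100)))
```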

  • Photoacoustic Super-Resolution Imaging using Laser Activation of Low-Boiling-Point Dye-Coated Nanodroplets in vitro and in vivo (2019 IEEE International Ultrasonics Symposium (IUS), Glasgow, United Kingdom, 2019.10.6-2019.10.9)

    Abstract: Deep neural networks (DNNs) trained on large data sets have been shown to capture high-quality features describing image data. Numerous studies have proposed ways to transfer DNN structures trained on large data sets to classification tasks represented by relatively small data sets. Due to the limitations of these proposals, it is not well understood how to effectively adapt a pre-trained model to a new task. Typically, the transfer process uses a combination of fine-tuning and training of adaptation layers; however, both are susceptible to data shortage and high computational complexity. This paper proposes an improvement to the well-known AlexNet feature extraction technique. The proposed approach applies a recursive neural network structure to features extracted by a deep convolutional neural network pre-trained on a large data set. Object recognition experiments conducted on the Washington RGB-D image data set show that the proposed method combines structural simplicity with higher recognition accuracy at a low computational cost compared with other relevant methods. The new approach requires no training in the feature extraction phase and can be performed very efficiently, as the output features are compact and highly discriminative and can be used with a simple classifier in object recognition settings.

    Keywords: Machine learning, knowledge transfer, pattern recognition, neural networks

    Updated 2025-09-19 17:13:59
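
A hedged sketch of the frozen-feature transfer idea from this abstract, using torchvision's pre-trained AlexNet as the convolutional feature extractor; the recursive neural network stage is replaced here by a plain linear SVM for brevity, and dataset loading is only indicated in comments.

```python
# Hypothetical sketch: frozen AlexNet convolutional features feeding a simple classifier.
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import LinearSVC

alexnet = models.alexnet(weights=models.AlexNet_Weights.DEFAULT).eval()
backbone = alexnet.features  # convolutional feature extractor, no further training needed

preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor(),
                        T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])])

@torch.no_grad()
def extract(images):
    """Return flattened convolutional features for a batch of PIL images."""
    batch = torch.stack([preprocess(im) for im in images])
    return backbone(batch).flatten(1).numpy()

# train_imgs/train_labels and test_imgs/test_labels would come from an RGB-D dataset loader:
# clf = LinearSVC().fit(extract(train_imgs), train_labels)
# print(clf.score(extract(test_imgs), test_labels))
```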

  • Property Prediction of Organic Donor Molecules for Photovoltaic Applications Using Extremely Randomized Trees

    Abstract: Organic solar cells are an inexpensive, flexible alternative to traditional silicon-based solar cells but are disadvantaged by low power conversion efficiency due to empirical design and complex manufacturing processes. The design process can be accelerated by generating a comprehensive set of potential candidates; however, this would require a laborious trial-and-error approach to modeling all possible polymer configurations. A machine learning model has the potential to accelerate the screening of potential donor candidates by associating structural features of each compound, encoded as molecular fingerprints, with its highest occupied molecular orbital (HOMO) energy. In this paper, extremely randomized tree learning models are employed for the prediction of HOMO values of donor compounds, and a web application is developed. The proposed models outperform neural networks trained on molecular fingerprints or SMILES, as well as other state-of-the-art architectures such as Chemception and Molecular Graph Convolution, on two datasets of varying sizes.

    Keywords: Cheminformatics, Machine Learning, Organic Photovoltaics, Solar Cells

    Updated 2025-09-19 17:13:59
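
A minimal sketch of extremely randomized trees regressing HOMO energies on molecular fingerprints, assuming RDKit Morgan fingerprints as the structural features; the example molecules, HOMO values, and hyperparameters are placeholders.

```python
# Hypothetical sketch: ExtraTrees predicting HOMO energies from Morgan fingerprints.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import ExtraTreesRegressor

def fingerprint(smiles: str, n_bits: int = 2048) -> np.ndarray:
    """Encode a molecule as a radius-2 Morgan fingerprint bit vector."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits)
    return np.array(fp, dtype=np.uint8)

# smiles_list / homo_values would come from a donor-candidate dataset (e.g. HOPV-style records).
smiles_list = ["c1ccc2[nH]ccc2c1", "c1ccsc1", "c1ccoc1"]   # placeholder molecules
homo_values = np.array([-5.6, -5.9, -6.1])                 # placeholder HOMO energies (eV)

X = np.stack([fingerprint(s) for s in smiles_list])
model = ExtraTreesRegressor(n_estimators=500, random_state=0).fit(X, homo_values)
print(model.predict(X[:1]))   # predicted HOMO for the first molecule
```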

  • Machine learning defect properties in Cd-based chalcogenides (2019 IEEE 46th Photovoltaic Specialists Conference (PVSC), Chicago, IL, USA, 2019.6.16-2019.6.21)

    Abstract: Impurity energy levels in the band gap can have serious consequences for a semiconductor’s performance as a photovoltaic absorber. Data-driven approaches can help accelerate the prediction of point defect properties in common semiconductors and thus lead to the identification of potential deep-lying impurity states. In this work, we use density functional theory (DFT) to compute defect formation energies and charge transition levels of hundreds of impurities in CdX chalcogenide compounds, where X = Te, Se, or S. We apply machine learning techniques to the DFT data and develop on-demand predictive models for the formation energy and relevant transition levels of any impurity atom in any site. The trained ML models are general and accurate enough to predict the properties of any possible point defect in any Cd-based chalcogenide, as we demonstrate by testing on a few selected defects in the mixed chalcogen compounds CdTe0.5Se0.5 and CdSe0.5S0.5. The ML framework used in this work can be extended to any class of semiconductors.

    Keywords: machine learning, point defects, CdTe, density functional theory, chalcogenides

    Updated 2025-09-19 17:13:59
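
A hedged sketch of the kind of surrogate model this abstract describes: a regressor mapping impurity/site descriptors to DFT formation energies. The descriptor set, file names, and random-forest learner are assumptions; the paper's actual features and model may differ.

```python
# Hypothetical sketch: regressing DFT defect formation energies on simple descriptors.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Each row: descriptors of (impurity atom, host compound, lattice site),
# e.g. electronegativity, covalent radius, valence-electron count, site identity flags.
X = np.load("defect_descriptors.npy")          # placeholder descriptor matrix
y = np.load("defect_formation_energies.npy")   # placeholder DFT formation energies (eV)

model = RandomForestRegressor(n_estimators=300, random_state=0)
print(cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error"))
```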

  • Coded Metasurface with Optical Activity Based on Broadband Asymmetric Transmission of Linearly Polarized Electromagnetic Waves (2019 PhotonIcs & Electromagnetics Research Symposium - Spring (PIERS-Spring), Rome, Italy, 2019.6.17-2019.6.20)

    Abstract: Learning models of artificial intelligence can nowadays perform very well on a large variety of tasks. In practice, however, different task environments are best handled by different learning models rather than a single universal approach. Most non-trivial models thus require the adjustment of several to many learning parameters, which is often done on a case-by-case basis by an external party. Meta-learning refers to the ability of an agent to autonomously and dynamically adjust its own learning parameters, or meta-parameters. In this paper, we show how projective simulation, a recently developed model of artificial intelligence, can naturally be extended to account for meta-learning in reinforcement learning settings. The projective simulation approach is based on a random walk process over a network of clips. The suggested meta-learning scheme builds upon the same design and employs clip networks to monitor the agent's performance and to adjust its meta-parameters on the fly. We distinguish between reflex-type adaptation and adaptation through learning, and show the utility of both approaches. In addition, a trade-off between flexibility and learning time is addressed. The extended model is examined on three different kinds of reinforcement learning tasks, in which the agent has different optimal values of the meta-parameters, and is shown to perform well, reaching near-optimal to optimal success rates in all of them, without ever needing to manually adjust any meta-parameter.

    Keywords: Machine learning, quantum mechanics, reinforcement learning, adaptive algorithm, random processes, learning, meta-learning

    Updated 2025-09-16 10:30:52
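
A toy sketch of reflex-type meta-parameter adaptation, in the spirit of the scheme described above: an exploration temperature is adjusted on the fly from the agent's recent success rate. This only illustrates the idea of self-adjusting meta-parameters; it is not the projective-simulation clip-network mechanism of the paper.

```python
# Hypothetical sketch: a reflex rule that adapts one meta-parameter from recent rewards.
from collections import deque

class MetaController:
    def __init__(self, window: int = 50, temp: float = 1.0):
        self.rewards = deque(maxlen=window)   # sliding window of recent rewards
        self.temp = temp                      # meta-parameter being adapted (exploration temperature)
        self.prev_rate = 0.0

    def update(self, reward: float) -> float:
        self.rewards.append(reward)
        rate = sum(self.rewards) / len(self.rewards)
        # Reflex rule: performance dropped -> explore more; improved -> exploit more.
        self.temp *= 1.1 if rate < self.prev_rate else 0.95
        self.temp = min(max(self.temp, 0.05), 10.0)
        self.prev_rate = rate
        return self.temp

# Illustrative use inside a reinforcement-learning loop:
ctrl = MetaController()
for reward in [1, 0, 0, 1, 0, 0, 0, 1]:       # placeholder reward stream
    print(round(ctrl.update(reward), 3))
```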

  • Machine learning algorithms for predicting the amplitude of chaotic laser pulses

    Abstract: Forecasting the dynamics of chaotic systems from the analysis of their output signals is a challenging problem with applications in most fields of modern science. In this work, we use a laser model to compare the performance of several machine learning algorithms for forecasting the amplitude of upcoming emitted chaotic pulses. We simulate the dynamics of an optically injected semiconductor laser that presents a rich variety of dynamical regimes when its parameters are changed. We focus on a particular dynamical regime that can show ultrahigh-intensity pulses, reminiscent of rogue waves. We compare the forecasting performance of several popular machine learning methods, namely deep learning, support vector machines, nearest neighbors, and reservoir computing. Finally, we analyze how their performance in predicting the height of the next optical pulse depends on the amount of noise and the length of the time series used for training.

    Keywords: chaotic systems, laser pulses, reservoir computing, deep learning, forecasting, support vector machine, machine learning, nearest neighbors

    Updated 2025-09-16 10:30:52
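
A compact sketch of the kind of comparison this abstract describes, with a logistic map standing in for the simulated laser intensity; the embedding window, the chosen regressors, and their hyperparameters are assumptions, and reservoir computing is omitted here (see the reservoir sketch under the last entry).

```python
# Hypothetical sketch: comparing simple regressors on next-value prediction of a chaotic series.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor

def logistic_map(n, x0=0.4, r=3.99):
    """Generate a chaotic surrogate series (stand-in for the laser pulse amplitudes)."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

series, window = logistic_map(5000), 10
X = np.lib.stride_tricks.sliding_window_view(series[:-1], window)  # past `window` values
y = series[window:]                                                # next value to forecast
split = 4000                                                       # chronological train/test split

for name, model in [("kNN", KNeighborsRegressor(5)), ("SVM", SVR(C=10.0)),
                    ("MLP", MLPRegressor((64, 64), max_iter=2000, random_state=0))]:
    model.fit(X[:split], y[:split])
    err = np.mean((model.predict(X[split:]) - y[split:]) ** 2)
    print(f"{name}: test MSE = {err:.2e}")
```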

  • Transfer Learning Using Ensemble Neural Networks for Organic Solar Cell Screening (2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 2019.7.14-2019.7.19)

    Abstract: Organic solar cells are a promising technology for solving the world's clean energy crisis. However, generating candidate chemical compounds for solar cells is a time-consuming process requiring thousands of hours of laboratory analysis. For a solar cell, the most important property is the power conversion efficiency, which depends on the highest occupied molecular orbital (HOMO) values of the donor molecules. Recently, machine learning techniques have proved very useful in building predictive models for the HOMO values of donor structures of organic photovoltaic cells (OPVs). Since experimental datasets are limited in size, current machine learning models are trained on data derived from calculations based on density functional theory (DFT). Molecular line notations such as SMILES and InChI are popular input representations for describing the molecular structure of donor molecules. The two line notations encode different information; for example, SMILES defines the bond types while InChI captures protonation. In this work, we present an ensemble deep neural network architecture, called SINet, which harnesses both the SMILES and InChI molecular representations to predict HOMO values, and we leverage transfer learning from a sizeable DFT-computed dataset, Harvard CEP, to build more robust predictive models for the relatively smaller HOPV dataset. The Harvard CEP dataset contains molecular structures and properties for 2.3 million candidate donor structures for OPVs, while HOPV contains DFT-computed and experimental values for 350 and 243 molecules, respectively. Our results demonstrate a significant performance improvement from the use of transfer learning and from leveraging both molecular representations.

    Keywords: Organic Solar Cells, InChI, SINet, HOMO values, SMILES, Transfer Learning, Machine Learning

    Updated 2025-09-16 10:30:52
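
A hedged sketch of a two-branch network consuming SMILES and InChI token sequences with a shared regression head, plus the pretrain-then-fine-tune recipe mentioned in the abstract; vocabulary sizes, layer widths, and the GRU encoders are assumptions and do not reproduce the SINet architecture itself.

```python
# Hypothetical sketch: two string branches (SMILES, InChI) fused to predict a HOMO value.
import torch
import torch.nn as nn

class StringBranch(nn.Module):
    """Embeds a character-encoded molecular string and pools it to a fixed vector."""
    def __init__(self, vocab_size: int, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim, padding_idx=0)
        self.gru = nn.GRU(dim, dim, batch_first=True)

    def forward(self, tokens):            # tokens: (batch, seq_len) integer tensor
        _, h = self.gru(self.embed(tokens))
        return h.squeeze(0)               # (batch, dim)

class TwoBranchHOMONet(nn.Module):
    def __init__(self, smiles_vocab: int = 64, inchi_vocab: int = 64):
        super().__init__()
        self.smiles_branch = StringBranch(smiles_vocab)
        self.inchi_branch = StringBranch(inchi_vocab)
        self.head = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, smiles_tokens, inchi_tokens):
        z = torch.cat([self.smiles_branch(smiles_tokens),
                       self.inchi_branch(inchi_tokens)], dim=1)
        return self.head(z).squeeze(-1)   # predicted HOMO value per molecule

# Transfer-learning recipe: pretrain on the large DFT-computed dataset, then fine-tune the
# same weights (typically with a lower learning rate) on the smaller experimental dataset.
model = TwoBranchHOMONet()
print(model(torch.randint(1, 64, (8, 120)), torch.randint(1, 64, (8, 150))).shape)
```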

  • Assessing the Modelling Approach and Datasets Required for Fault Detection in Photovoltaic Systems (2019 IEEE Industry Applications Society Annual Meeting, Baltimore, MD, USA, 2019.9.29-2019.10.3)

    Abstract: Reliable monitoring of photovoltaic (PV) assets is essential to ensuring uptake, long-term performance, and maximum return on investment for renewable systems. To this end, this paper investigates the input data and machine learning techniques required for day-behind predictions of PV generation, within the scope of conducting informed maintenance of these systems. Five years of hourly PV generation data were retrieved from four commercial building-mounted PV installations in the UK, together with weather data retrieved from MIDAS. A support vector machine, a random forest, and an artificial neural network were trained to predict PV power generation. The random forest performed best, achieving an average mean relative error of 2.7%. Irradiance, previous generation, and solar position were found to be the most important variables. Overall, this work shows how low-cost, data-driven analysis of PV systems can be used to support the effective management of such assets.

    Keywords: weather data, random forest, machine learning, photovoltaics, fault detection

    Updated 2025-09-16 10:30:52
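
A minimal sketch of day-behind PV generation prediction with a random forest and its variable importances; the CSV layout, feature names, and lagged-generation feature are assumptions standing in for the metered and MIDAS weather data.

```python
# Hypothetical sketch: random-forest prediction of hourly PV generation from weather features.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("pv_hourly.csv", parse_dates=["timestamp"])   # placeholder file
features = ["irradiance_forecast", "temperature_forecast", "solar_elevation",
            "solar_azimuth", "generation_24h_ago"]
X, y = df[features], df["generation_kwh"]

split = int(len(df) * 0.8)                                      # chronological train/test split
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X.iloc[:split], y.iloc[:split])

pred = model.predict(X.iloc[split:])
print("MAE:", mean_absolute_error(y.iloc[split:], pred))
print(dict(zip(features, model.feature_importances_.round(3))))  # variable importance
```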

  • Cross-predicting the dynamics of an optically injected single-mode semiconductor laser using reservoir computing

    Abstract: In real-world dynamical systems, technical limitations may prevent complete access to the dynamical variables. Such a lack of information may cause significant problems, especially when monitoring or controlling the dynamics of the system is required, or when decisions need to be taken based on the dynamical state of the system. Cross-predicting the missing data is therefore of considerable interest. Here, we use a machine learning algorithm based on reservoir computing to perform cross-prediction of unknown variables of a chaotic dynamical laser system. In particular, we choose a realistic model of an optically injected single-mode semiconductor laser. While the intensity of the laser can often be acquired easily, measuring the phase of the electric field and the carriers in real time, although possible, requires a more demanding experimental scheme. We demonstrate that the dynamics of two of the three dynamical variables describing the state of the laser can be reconstructed accurately from knowledge of only one variable, provided our algorithm has been trained beforehand with all three variables for a limited period of time. We analyze the accuracy of the method depending on the parameters of the laser system and of the reservoir. Finally, we test the robustness of the cross-prediction method when noise is added to the time series. The suggested reservoir computing state observer might be used in many applications, including reconstructing time series, recovering lost time series data, and testing data encryption security in cryptography based on chaotic synchronization of lasers.

    Keywords: reservoir computing, chaotic dynamics, cross-prediction, machine learning, semiconductor laser

    Updated 2025-09-16 10:30:52
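
A self-contained sketch of reservoir-computing cross-prediction: an echo-state network is driven by a single observed variable and a ridge-regression readout reconstructs the hidden ones. A Lorenz system stands in for the laser model, and the reservoir size, spectral radius, and regularization are assumed values.

```python
# Hypothetical sketch: echo-state network cross-prediction of unobserved chaotic variables.
import numpy as np

rng = np.random.default_rng(0)

def lorenz(n, dt=0.01, s=10.0, r=28.0, b=8 / 3):
    """Euler-integrated Lorenz trajectory used as a surrogate chaotic system."""
    v = np.array([1.0, 1.0, 1.0])
    out = np.empty((n, 3))
    for i in range(n):
        dv = np.array([s * (v[1] - v[0]),
                       v[0] * (r - v[2]) - v[1],
                       v[0] * v[1] - b * v[2]])
        v = v + dt * dv
        out[i] = v
    return out

data = lorenz(6000)
u, targets = data[:, :1], data[:, 1:]          # observed variable; hidden variables to reconstruct

# Echo-state reservoir: fixed random input and recurrent weights, spectral radius rescaled to 0.9.
N = 400
W_in = rng.uniform(-0.5, 0.5, (N, 1))
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

states = np.zeros((len(u), N))
x = np.zeros(N)
for t in range(len(u)):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Ridge-regression readout trained on an initial segment (after a washout), tested on the rest.
train, warmup, lam = 4000, 200, 1e-6
S, Y = states[warmup:train], targets[warmup:train]
W_out = np.linalg.solve(S.T @ S + lam * np.eye(N), S.T @ Y)
pred = states[train:] @ W_out
print("test RMSE per hidden variable:", np.sqrt(((pred - targets[train:]) ** 2).mean(axis=0)))
```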