As a typical deep network, the stacked autoencoder (SAE) offers strong modeling capability for soft sensors thanks to its ability to extract deep features. However, SAE ignores the expanded ...
Serial-autoencoder for personalized recommendation. Frontiers of Computer Science, Higher Education Press. DOI: 10.1007/s11704-023-2441-1 ...
Compared to SAE, the idea of SDGAE is to stack a series of hierarchical dual-guided autoencoders (DGAEs), so that each DGAE can accurately reconstruct the original input data. It also ...
Normalizing and Encoding Source Data for an Autoencoder. In practice, preparing the source data for an autoencoder is the most time-consuming part of the dimensionality reduction process. To normalize ...
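A minimal normalization sketch, assuming min-max scaling of each column to [0, 1]; the sample values below are made up for illustration and are not from the snippet above:

import numpy as np

def min_max_normalize(x: np.ndarray) -> np.ndarray:
    # Scale each column of x independently to the [0, 1] range.
    col_min = x.min(axis=0)
    col_max = x.max(axis=0)
    span = np.where(col_max > col_min, col_max - col_min, 1.0)  # guard against constant columns
    return (x - col_min) / span

raw = np.array([[68.0, 52000.0],
                [23.0, 31000.0],
                [51.0, 77000.0]])
print(min_max_normalize(raw))  # each column now lies in [0, 1]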
Data-driven soft sensors play an important role in practical processes and have been widely applied. They provide real-time prediction of quality variables and then guide production and improve ...
For the State predictor variable with 3 possible values, the encoding is Michigan = 100, Nebraska = 010, Oklahoma = 001. If the State variable had an additional possible value, say Pennsylvania, the ...
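A short sketch of that one-hot scheme; the dictionary-based helper below is illustrative rather than taken from the source:

# Build one-hot codes for the three State values quoted above.
states = ["Michigan", "Nebraska", "Oklahoma"]
one_hot = {s: [1 if j == i else 0 for j in range(len(states))]
           for i, s in enumerate(states)}

print(one_hot["Michigan"])   # [1, 0, 0]
print(one_hot["Nebraska"])   # [0, 1, 0]
print(one_hot["Oklahoma"])   # [0, 0, 1]
# A fourth possible value would simply widen each vector to length 4.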
The multivariate landslide dataset was used as both the input and the output to train the stacked autoencoder. Subsequently, on the central latent vector of the stacked autoencoder, the Fuzzy ...
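A rough sketch of that setup, not the cited study's code: a small Keras autoencoder is fit with the same array as input and target, and the central latent vectors are then extracted. The layer sizes and the synthetic stand-in data are assumptions; the fuzzy clustering step mentioned above would run on the extracted codes.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

x = np.random.rand(500, 12).astype("float32")                  # stand-in for the multivariate data

inputs = keras.Input(shape=(12,))
h = layers.Dense(8, activation="relu")(inputs)
latent = layers.Dense(3, activation="relu", name="latent")(h)  # central latent vector
h = layers.Dense(8, activation="relu")(latent)
outputs = layers.Dense(12, activation="sigmoid")(h)

autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(x, x, epochs=20, batch_size=32, verbose=0)     # input and target are the same array

encoder = keras.Model(inputs, latent)
z = encoder.predict(x, verbose=0)   # latent features; the fuzzy clustering step would operate on z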
First, for each latent variable, we find the maximum value it attains across a set of randomly selected trials. We then set that specific latent to its maximum value, and this new ...
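A hedged sketch of that procedure. The mean-code baseline and the decode callable are assumptions for illustration; in practice the decoder would come from the trained autoencoder.

import numpy as np

def maximize_each_latent(z, decode, n_trials=100, seed=0):
    # z: (n_samples, n_latent) latent codes; decode: maps a (1, n_latent) code back to input space.
    rng = np.random.default_rng(seed)
    subset = z[rng.choice(len(z), size=min(n_trials, len(z)), replace=False)]
    outputs = []
    for k in range(subset.shape[1]):
        peak = subset[:, k].max()              # maximum value this latent reaches over the trials
        probe = subset.mean(axis=0).copy()     # baseline code (an assumption, not stated above)
        probe[k] = peak                        # push latent k to its maximum
        outputs.append(decode(probe[None, :])) # decode the modified latent vector
    return outputs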