Journal of Textile Research, 2022, Vol. 43, Issue (08): 197-205. doi: 10.13475/j.fzxb.20210107509
• Comprehensive Review •
LIU Huanhuan1,2,3, WANG Zhaohui1,2,3, YE Qinwen1,2, CHEN Ziwei1,2, ZHENG Jingjin1,2
[1] SINGH R R, CONJETI S, BANERJEE R. A comparative evaluation of neural network classifiers for stress level analysis of automotive drivers using physiological signals[J]. Biomedical Signal Processing and Control, 2013, 8(6): 740-754. doi: 10.1016/j.bspc.2013.06.014
[2] HUANG Tianlan. Wearable clothing structure and technology research[J]. Beauty and Times, 2017(5): 102-106.
[3] SCHMIDT P, REISS A, DURICHEN R, et al. Wearable-based affect recognition: a review[J]. Sensors, 2019, 19(19): 4079-4121. doi: 10.3390/s19194079
[4] EKMAN P, LEVENSON R W, FRIESEN W V. Autonomic nervous system activity distinguishes among emotions[J]. Science, 1983, 221(4616): 1208-1210. doi: 10.1126/science.6612338
[5] RUSSELL J A. Mixed emotions viewed from the psychological constructionist perspective[J]. Emotion Review, 2017, 9(2): 111-117. doi: 10.1177/1754073916639658
[6] MCCORRY L K. Physiology of the autonomic nervous system[J]. American Journal of Pharmaceutical Education, 2007, 71(4): 11. doi: 10.5688/aj710111
[7] KREIBIG S D. Autonomic nervous system activity in emotion: a review[J]. Biological Psychology, 2010, 84(3): 394-421. doi: 10.1016/j.biopsycho.2010.03.010
[8] WU Xuekui, REN Lihong, DING Yongsheng. Multi-physiology information fusion for emotion distinction in smart clothing[J]. Computer Engineering and Applications, 2009, 45(33): 218-221. doi: 10.3778/j.issn.1002-8331.2009.33.069
[9] ALAM M G R, ABEDIN S F, MOON S I, et al. Healthcare IoT-based affective state mining using a deep convolutional neural network[J]. IEEE Access, 2019, 7: 75189-75202. doi: 10.1109/ACCESS.2019.2919995
[10] PICARD R W, HEALEY J. Affective wearables[J]. Personal Technologies, 1997, 1(4): 231-240. doi: 10.1007/BF01682026
[11] D'MELLO S K, KORY J. A review and meta-analysis of multimodal affect detection systems[J]. ACM Computing Surveys, 2015, 47(3): 36.
[12] PICARD R W, VYZAS E, HEALEY J. Toward machine emotional intelligence: analysis of affective physiological state[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001, 23(10): 1175-1191. doi: 10.1109/34.954607
[13] KHAN A M, LAWO M. Developing a system for recognizing the emotional states using physiological devices[C]//12th International Conference on Intelligent Environments. London: IEEE, 2016: 48-53.
[14] VALENZA G C, GENTILI C, LANATA A, et al. Mood recognition in bipolar patients through the PSYCHE platform: preliminary evaluations and perspectives[J]. Artificial Intelligence in Medicine, 2013, 57(1): 49-58. doi: 10.1016/j.artmed.2012.12.001
[15] SOLEYMANI M, LICHTENAUER J, PUN T, et al. A multimodal database for affect recognition and implicit tagging[J]. IEEE Transactions on Affective Computing, 2012, 3(1): 42-55. doi: 10.1109/T-AFFC.2011.25
[16] MAHDIANI S, JEYHANI V, PELTOKANGAS, et al. Is 50 Hz high enough ECG sampling frequency for accurate HRV analysis?[C]//IEEE Engineering in Medicine and Biology Society. Milan: IEEE, 2015: 5948-5951.
[17] ZHU J P, JI L Z, JI C Y, et al. Heart rate variability monitoring for emotion and disorders of emotion[J]. Physiological Measurement, 2019. DOI: 10.1088/1361-6579/AB1887.
[18] ZANGRONIZ R, MARTINEZ-RODRIGO A, PASTOR J M, et al. Electrodermal activity sensor for classification of calm/distress condition[J]. Sensors, 2017. DOI: 10.3390/s17102324.
[19] DOMINGUEZ-JIMENEZ J A, CAMPO-LANDINES K C, MARTINEZ-SANTOS J C, et al. A machine learning model for emotion recognition from physiological signals[J]. Biomedical Signal Processing and Control, 2020. DOI: 10.1016/j.bspc.2019.101646.
[20] HEALEY J A, PICARD R W. Detecting stress during real-world driving tasks using physiological sensors[J]. IEEE Transactions on Intelligent Transportation Systems, 2005, 6(2): 156-166. doi: 10.1109/TITS.2005.848368
[21] LYKKEN D T, VENABLES P H. Direct measurement of skin conductance: a proposal for standardization[J]. Psychophysiology, 1971, 8(5): 656-672. doi: 10.1111/j.1469-8986.1971.tb00501.x
[22] FENG H H, GOLSHAN H M, MAHOOR M H. A wavelet-based approach to emotion classification using EDA signals[J]. Expert Systems with Applications, 2018, 112: 77-86. doi: 10.1016/j.eswa.2018.06.014
[23] JANG E H, PARK B J, PARK M S, et al. Analysis of physiological signals for recognition of boredom, pain, and surprise emotions[J]. Journal of Physiological Anthropology, 2015. DOI: 10.1186/s40101-015-0063-5.
[24] ZHANG Q, CHEN X X, ZHAN Q Y, et al. Respiration-based emotion recognition with deep learning[J]. Computers in Industry, 2017, 92/93: 84-90. doi: 10.1016/j.compind.2017.04.005
[25] DAR M N, AKRAM M U, KHAWAJA S G, et al. CNN and LSTM-based emotion charting using physiological signals[J]. Sensors, 2020, 20(16): 26. doi: 10.3390/s20010026
[26] SOLEYMANI M, PANTIC M, PUN T. Multimodal emotion recognition in response to videos[J]. IEEE Transactions on Affective Computing, 2012, 3(2): 211-223. doi: 10.1109/T-AFFC.2011.37
[27] SARIYANIDI E, GUNES H, CAVALLARO A. Automatic analysis of facial affect: a survey of registration, representation, and recognition[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(6): 1113-1133. doi: 10.1109/TPAMI.2014.2366127
[28] NAG A, HABER H, VOSS C, et al. Toward continuous social phenotyping: analyzing gaze patterns in an emotion recognition task for children with autism through wearable smart glasses[J]. Journal of Medical Internet Research, 2020. DOI: 10.2196/preprint.13810.
[29] KATSIS C D, GOLETSIS Y, RIGAS G, et al. A wearable system for the affective monitoring of car racing drivers during simulated conditions[J]. Transportation Research Part C: Emerging Technologies, 2011, 19(3): 541-551. doi: 10.1016/j.trc.2010.09.004
[30] DORES A R, BARBOSA F, QUEIROS C, et al. Recognizing emotions through facial expressions: a largescale experimental study[J]. International Journal of Environmental Research and Public Health, 2020, 17(20): 7240-7253. doi: 10.3390/ijerph17197240
[31] CUI Liqing, LI Shun, ZHU Tingzhao. Emotion detection based on natural gaits recorded by wearable devices[C]//Abstracts of the 18th National Psychology Conference: Psychology and Social Development. Tianjin: [s.n.], 2015: 623-624.
[32] KAMINSKA D, SAPINSKI T, ANBARJAFARI G. Efficiency of chosen speech descriptors in relation to emotion recognition[J]. EURASIP Journal on Audio, Speech, and Music Processing, 2017(1): 3-12.
[33] HAN Wenjing, LI Haifeng, RUAN Huabin, et al. Review on speech emotion recognition[J]. Journal of Software, 2014, 25(1): 37-50.
[34] NALEPA G J, KUTT K, BOBEK S. Mobile platform for affective context-aware systems[J]. Future Generation Computer Systems: The International Journal of eScience, 2019, 92: 490-503.
[35] ALAJMI N, KANJO E, MAWASS N E, et al. Shopmobia: an emotion-based shop rating system[C]//Humaine Association Conference on Affective Computing and Intelligent Interaction. Geneva: IEEE, 2013: 745-750.
[36] FERNANDEZ-DELGADO M, CERNADAS E, BARRO S, et al. Do we need hundreds of classifiers to solve real world classification problems?[J]. Journal of Machine Learning Research, 2014, 15: 3133-3181.
[37] RUBIN J. Time, frequency & complexity analysis for recognizing panic states from physiologic time series[C]//10th EAI International Conference on Pervasive Computing Technologies for Healthcare. UK: ACM, 2016: 81-88.
[38] FRIEDMAN J, HASTIE T, TIBSHIRANI R. Additive logistic regression: a statistical view of boosting[J]. Annals of Statistics, 2000, 28(2): 337-374.
[39] MOZOS O M, SANDULESCU V, ANDREWS S, et al. Stress detection using wearable physiological and sociometric sensors[J]. International Journal of Neural Systems, 2017, 27(2): 16.
[40] WAGNER J, KIM J, ANDRE E, et al. From physiological signals to emotions: implementing and comparing selected methods for feature extraction and classification[C]//2005 IEEE International Conference on Multimedia and Expo. Amsterdam: IEEE, 2005: 941-944.
[41] KIM D, SEO Y, CHO J, et al. Detection of subjects with higher self-reporting stress scores using heart rate variability patterns during the day[C]//IEEE Engineering in Medicine and Biology Society Conference Proceedings. Vancouver: IEEE, 2008: 682-685.
[42] CHANEL G, KIERKELS J M, SOLEYMANI M, et al. Short-term emotion assessment in a recall paradigm[J]. International Journal of Human-Computer Studies, 2009, 67(8): 607-627. doi: 10.1016/j.ijhcs.2009.03.005
[43] VALENZA G, LANATA A, SCILINGO P E. The role of nonlinear dynamics in affective valence and arousal recognition[J]. IEEE Transactions on Affective Computing, 2012, 3(2): 237-249. doi: 10.1109/T-AFFC.2011.30
[44] MARTINEZ H P, BENGIO Y, YANNAKAKIS N G. Learning deep physiological models of affect[J]. IEEE Computational Intelligence Magazine, 2013, 8(2): 20-33. doi: 10.1109/MCI.2013.2247823
[45] ABADI M K, SUBRAMANIAN R, KIA M S, et al. DECAF: MEG-based multimodal database for decoding affective physiological responses[J]. IEEE Transactions on Affective Computing, 2015, 6(3): 209-222. doi: 10.1109/TAFFC.2015.2392932
[46] RATHOD P, GEORGE K, SHINDE N, et al. Bio-signal based emotion detection device[C]//IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks. San Francisco: IEEE, 2016: 105-108.
[47] BIRJANDTALAB J, COGAN D, POUYAN B M, et al. A non-EEG biosignals dataset for assessment and visualization of neurological status[C]//IEEE International Workshop on Signal Processing Systems. Dallas: IEEE, 2016: 110-114.
[48] GIRARDI D, LANUBILE F, NOVIELLI N, et al. Emotion detection using noninvasive low cost sensors[C]//Seventh International Conference on Affective Computing and Intelligent Interaction. San Antonio: IEEE, 2017: 125-130.
[49] ZHAO B B, WANG Z, YU W Z, et al. Emotionsense: emotion recognition based on wearable wristband[C]//IEEE Smartworld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computing, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation. Guangzhou: IEEE, 2018: 346-355.
[50] SANTAMARIA-GRANADOS L, MUNOZ-ORGANERO M, RAMIREZ-GONZALEZ G, et al. Using deep convolutional neural network for emotion detection on a physiological signals dataset (AMIGOS)[J]. IEEE Access, 2019, 7: 57-67. doi: 10.1109/ACCESS.2018.2883213
[51] HASSAN M M, ALAM M G R, UDDIN M Z, et al. Human emotion recognition using deep belief network architecture[J]. Information Fusion, 2019, 51: 10-18. doi: 10.1016/j.inffus.2018.10.009
[52] DI LASCIO E, GASHI S, SANTINI S, et al. Laughter recognition using non-invasive wearable devices[C]//13th EAI International Conference on Pervasive Computing Technologies for Healthcare. Trento: ACM, 2019: 262-271.
[53] HEINISCH J S, ANDERSON C, DAVID K, et al. Angry or climbing stairs? Towards physiological emotion recognition in the wild[C]//IEEE International Conference on Pervasive Computing and Communications Workshops. Kyoto: IEEE, 2019: 486-491.
[54] YANG J, WANG R, GUAN X, et al. AI-enabled emotion-aware robot: the fusion of smart clothing, edge clouds and robotics[J]. Future Generation Computer Systems, 2020, 102: 701-709. doi: 10.1016/j.future.2019.09.029
[55] KWON J, KIM D H, PARK W, et al. A wearable device for emotional recognition using facial expression and physiological response[C]//38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society. Orlando: IEEE, 2016: 5765-5768.
[56] XU Shiyi. Research on innovative design of emotional monitoring intelligent bracelet for the aged[D]. Xi'an: Xi'an University of Technology, 2019: 56-68.
[57] YANG J, ZHOU J, TAO G M, et al. Wearable 3.0: from smart clothing to wearable affective robot[J]. IEEE Network, 2019, 33(6): 8-14.
[58] LANATA A, VALENZA G, NARDELLI M, et al. Complexity index from a personalized wearable monitoring system for assessing remission in mental health[J]. IEEE Journal of Biomedical and Health Informatics, 2015, 19(1): 132-139.
[59] WANG Xiaoxin. A study on the application of bionic colour design[J]. New Vision Art, 2011(1): 89-90.