
Self-attention-based neural networks for refining the overlength product titles

Bibliographic Details
Published in: Multimedia Tools and Applications, 2021-07, Vol. 80 (18), p. 28501-28519
Main Authors: Lin, Yuming, Fu, Yu, Li, You, Cai, Guoyong, Zhou, Aoying
Format: Article
Language: English
Summary: Online sellers on e-commerce platforms often produce redundant, lengthy product titles padded with extra information to attract customers' attention. Such overlength titles become a problem when they are displayed on mobile applications. In this paper, the problem of refining redundant and overlength product titles is studied in order to generate concise and informative titles. First, the title-refinement task is transformed into a sequential classification problem that predicts whether each word in the original title will remain in the final short title. Then, a self-attention-based neural network is proposed to extract the most informative words from the original title and construct the short title. The basic model is further extended with a gated recurrent unit (GRU) network and a gating mechanism to improve the position-encoding process and to learn the weights of encoding features from different directions. Moreover, an algorithm is designed to construct datasets for redundant product-title compression based on the open dataset LESD4EC. Finally, extensive experiments on the rebuilt datasets demonstrate the effectiveness and efficiency of the proposed methods. The experimental results show that the proposed methods significantly outperform state-of-the-art methods in precision, recall, F1 score, and mean absolute error, as well as in runtime and space cost.
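The formulation described in the abstract — self-attention over the title's words followed by a per-word keep/drop decision — can be illustrated with a minimal, framework-free sketch. This is not the authors' model: the scaled dot-product attention, the hand-supplied word embeddings, and the `scorer` callable (standing in for the learned classification head) are all simplifying assumptions for illustration only.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(embeddings):
    """Scaled dot-product self-attention where queries, keys, and
    values are all the word embeddings (a single untrained head)."""
    d = len(embeddings[0])
    out = []
    for q in embeddings:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in embeddings]
        weights = softmax(scores)
        # Context vector: attention-weighted sum of all word embeddings.
        ctx = [sum(w * k[j] for w, k in zip(weights, embeddings))
               for j in range(d)]
        out.append(ctx)
    return out

def compress_title(words, embeddings, scorer, threshold=0.5):
    """Sequential classification view of title compression: keep each
    word whose attended representation scores above the threshold.
    `scorer` is a hypothetical stand-in for a trained classifier head."""
    ctx = self_attention(embeddings)
    return [w for w, c in zip(words, ctx) if scorer(c) > threshold]
```

In the paper's setting the embeddings and the classification head are learned, and the position encoding is refined with a GRU and a gating mechanism; here the weights are fixed purely so the keep/drop mechanics are visible end to end.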
ISSN: 1380-7501; 1573-7721
DOI: 10.1007/s11042-021-10908-x