<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/xsl" href="static/style.xsl"?><OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd"><responseDate>2026-04-18T07:47:16Z</responseDate><request verb="GetRecord" identifier="oai:www.recercat.cat:10230/35940" metadataPrefix="oai_dc">https://recercat.cat/oai/request</request><GetRecord><record><header><identifier>oai:recercat.cat:10230/35940</identifier><datestamp>2025-12-21T18:09:15Z</datestamp><setSpec>com_2072_6</setSpec><setSpec>col_2072_452952</setSpec></header><metadata><oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:doc="http://www.lyncode.com/xoai" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
   <dc:title>Interpretable emoji prediction via label-wise attention LSTMs</dc:title>
   <dc:creator>Barbieri, Francesco</dc:creator>
   <dc:creator>Espinosa-Anke, Luis</dc:creator>
   <dc:creator>Camacho-Collados, Jose</dc:creator>
   <dc:creator>Schockaert, Steven</dc:creator>
   <dc:creator>Saggion, Horacio</dc:creator>
   <dc:description>Paper presented at the Conference on Empirical Methods in Natural Language Processing, held from October 31 to November 4, 2018, in Brussels, Belgium.</dc:description>
   <dc:description>Human language has evolved towards newer forms of communication such as social media, where emojis (i.e., ideograms bearing a visual meaning) play a key role. While there is an increasing body of work aimed at the computational modeling of emoji semantics, there is currently little understanding about what makes a computational model represent or predict a given emoji in a certain way. In this paper we propose a label-wise attention mechanism with which we attempt to better understand the nuances underlying emoji prediction. In addition to advantages in terms of interpretability, we show that our proposed architecture improves over standard baselines in emoji prediction, and does particularly well when predicting infrequent emojis.</dc:description>
   <dc:description>F. Barbieri and H. Saggion acknowledge support from the TUNER project (TIN2015-65308-C5-5-R, MINECO/FEDER, UE). Luis Espinosa-Anke, Jose Camacho-Collados and Steven Schockaert have been supported by ERC Starting Grant 637277.</dc:description>
   <dc:date>2018-12-03T10:51:43Z</dc:date>
   <dc:date>2018</dc:date>
   <dc:type>info:eu-repo/semantics/conferenceObject</dc:type>
   <dc:type>info:eu-repo/semantics/publishedVersion</dc:type>
   <dc:identifier>Barbieri F, Espinosa-Anke L, Camacho-Collados J, Schockaert S, Saggion H. Interpretable emoji prediction via label-wise attention LSTMs. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing; 2018 Oct 31-Nov 4; Brussels, Belgium. New York: Association for Computational Linguistics; 2018. p. 4766-71.</dc:identifier>
   <dc:identifier>978-1-948087-84-1</dc:identifier>
   <dc:identifier>1530-9312</dc:identifier>
   <dc:identifier>http://hdl.handle.net/10230/35940</dc:identifier>
   <dc:language>eng</dc:language>
   <dc:relation>Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing; 2018 Oct 31-Nov 4; Brussels, Belgium. New York: Association for Computational Linguistics; 2018.</dc:relation>
   <dc:relation>info:eu-repo/grantAgreement/ES/1PE/TIN2015-65308-C5-5-R</dc:relation>
   <dc:rights>© ACL, Creative Commons Attribution 4.0 License</dc:rights>
   <dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
   <dc:format>application/pdf</dc:format>
   <dc:publisher>ACL (Association for Computational Linguistics)</dc:publisher>
</oai_dc:dc></metadata></record></GetRecord></OAI-PMH>