KEYWORDS: Education and training, Data modeling, Random forests, Transformers, Process modeling, Machine learning, Parallel processing, Feature selection, Detection and tracking algorithms, Performance modeling
Browser fingerprinting has been widely used as a user-tracking technique in recent years. As a long-term tracking technique, it requires not only obtaining unique browser fingerprints but also linking fingerprints from the same browser instance, because browser fingerprints change rapidly and frequently. To improve the efficiency of linking these evolving browser fingerprints, in this paper we propose a browser fingerprint linking method based on the Transformer-encoder. The Transformer-encoder uses an attention mechanism to focus on relevant parts of the input sequence, enabling it to capture complex connections and interactions within the data more efficiently. To make the most of the Transformer-encoder's parallel processing mechanism, we combine multiple fingerprint comparison vectors into a single input vector to train the model. We conduct extensive experiments on a public dataset to evaluate the proposed model. The experimental results show that our model outperforms several existing models, demonstrating the effectiveness of the Transformer-encoder for linking browser fingerprints.
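The core mechanism the abstract relies on can be illustrated with a minimal sketch of single-head scaled dot-product self-attention applied to a sequence of fingerprint comparison vectors. This is not the authors' code; the dimensions, weight matrices, and the framing of "4 comparison vectors per input" are illustrative assumptions, shown only to make concrete how attention computes pairwise interactions across the combined input.

```python
# Illustrative sketch (assumed, not the paper's implementation):
# single-head scaled dot-product self-attention over a batch of
# fingerprint comparison vectors packed into one input sequence.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Self-attention over a sequence of comparison vectors.

    X: (seq_len, d_model) -- each row is one fingerprint comparison vector.
    Returns an attended representation of the same shape.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise interaction scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # weighted combination

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8   # hypothetical: 4 comparison vectors per input
X = rng.standard_normal((seq_len, d_model))
Wq = rng.standard_normal((d_model, d_model))
Wk = rng.standard_normal((d_model, d_model))
Wv = rng.standard_normal((d_model, d_model))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # one attended vector per comparison vector
```

Because every comparison vector attends to every other one in a single matrix multiplication, all positions in the packed input are processed in parallel, which is the property the proposed batching of comparison vectors exploits.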