[ | |
{ | |
"path": "table_paper/2407.00082v1.json", | |
"table_id": "5", | |
"section": "5.4", | |
"all_context": [ | |
"The influence of clustering As mentioned previously, the clustering module can effectively achieve semantic matching at a coarse-grained level.", | |
"Therefore, we partition the dataset into four subsets that do not overlap each other to train the model to achieve the following four tasks: 1) existing jobs for existing users, 2) existing jobs for users who have just revised resumes, 3) new jobs for existing users, and 4) new jobs for users who have just revised resumes.", | |
"The comparison results are shown in Table 5.4 .", | |
"[ tabular=ccccc, table head= Ours Non-clustering H@10 M@10 H@10 M@10 , late after line= , late after last line= ]data/exp_clustering.txt\\csvlinetotablerow Compared to the framework without a clustering module, BISTRO is better adapted to semantic matching: neither the HR nor the MRR suffers from a significant drop, which shows the efficiency of the clustering module.", | |
"The effect of hyperedges The construction of hyperedges in BISTRO can increase the density of the interaction graph, which could add more structural information to address the preference drift issue.", | |
"To verify this idea, we use the following formula to compute the density of an undirected graph: .", | |
"As shown in Table 5.4 , the density of the graph is doubled when we add all types of hyperedges and the experimental results have also been improved due to the optimization of the structure in the graph.", | |
"[ tabular=cccc, table head= Density H@10 M@10 , late after line= , late after last line= ]data/exp_hyperedge.txt\\csvlinetotablerow Non-Noise Add 10% Noise Add 20% Noise Add 50% Noise The validity of hypergraph wavelet filter In BISTRO, we design a novel hypergraph wavelet learning method.", | |
"In this learning method, a wavelet filter is deployed for data denoising as well as fine-grained job preference feature extraction.", | |
"As shown in Figure LABEL:fig:exp_filter_line, the curves illustrate the results of three models, which have different filtering settings, under different percentages of noise in the data.", | |
"We can also visualize from it that our method, BISTRO, has the smoothest decrease in model performance as the proportion of noise in the data increases.", | |
"To vividly show the denoising capability of the proposed hypergraph wavelet filter, we randomly select a user who is active in a week, filter the 50 most recent interactions from three job categories, and construct an interaction graph.", | |
"In this graph, each node represents a job the user has engaged with, interconnected by grey dotted lines, while the interaction sequence of the user is depicted with grey edges.", | |
"On this basis, we introduce noisy jobs (marked with orange crosses) and their corresponding interactions (denoted by orange edges and dotted lines) to mimic the effect of a user accidentally clicking on unrelated job types.", | |
"Given that each model generates job preference representations for diverse jobs, we visualize the connections between the user and jobs, as well as the relationships among jobs themselves, as shown in Figure 4 .", | |
"We eliminate edges whose cosine similarity between job representation pairs fell below a uniform threshold and remove links between isolated jobs and the user.", | |
"Consequently, a graph with more orange lines indicates lower model performance.", | |
"Notably, as data noise levels escalated, the comparative models demonstrated diminished noise filtering effectiveness relative to our proposed approach.", | |
"Specifically, the random walk-based method significantly underperformed compared to the spectral GCN method, primarily due to the ability of spectral graph neural networks to filter out irrelevant interaction features.", | |
"Furthermore, our approach employs a wavelet kernel to create a set of sub-filters, adeptly denoising by dynamically selecting appropriate filters for the user s evolving characteristics.", | |
"The size of user (job) groups The size of user and job groups are two hyperparameters that need to be predefined.", | |
"Therefore, we choose 500:1, 1000:1, and 2000:1 as the ratios of the total number of users and the number of user groups , and 100:1, 500:1, 1000:1 as the ratios of the total number of jobs and the number of job groups for our experiments respectively, as shown in Figure LABEL:fig:para_user and LABEL:fig:para_job.", | |
"We can easily observe that our model achieves best when 1000:1 and 500:1.", | |
"The order of Chevbyshev approximation The order of Chevbyshev approximation greatly impacts the performance of hypergraph wavelet neural networks.", | |
"To find the best order, we test our model with and , and the results are shown in Figure LABEL:fig:para_p.", | |
"We can see that the performance of the model remains constant when is greater than or equal to 3.", | |
"Notice that as increases, the computational overhead of the model will also increase, so we choose as the hyperparameter of our model.", | |
"The average length of a session The length of the session is another hyperparameter that affects the performance of the model.", | |
"In Figure LABEL:fig:session_len_a, we can see that the average length of each session is 19-37 on average, and such short behavioral sequences in job recommendations (In job recommendations, the average number of interactions for users to find a suitable job is more than 80 times) are easily interfered with by noisy interactions.", | |
"Therefore, we further compared our proposed framework with the top-2 baselines under different session length, as illustrated in Figure LABEL:fig:session_len_b.", | |
"It can be seen that when the session length is relatively short, noise has a huge negative impact on the accuracy of all models.", | |
"However, as the session length decreases, our framework is more robust than the other two methods and can better resist noise interference.", | |
"The details of these baselines are as follows: BasicMF (Koren et al., 2009 ): A model that combines matrix factorization with a Multilayer Perceptron (MLP) for recommendations.", | |
"ItemKNN (Wang et al., 2006 ): A recommender that utilizes item-based collaborative filtering.", | |
"PureSVD (Cremonesi et al., 2010 ): An approach that applies Singular Value Decomposition for recommendation tasks.", | |
"SLIM (Ning and Karypis, 2011 ): A recommendation method known as the Sparse Linear Method.", | |
"DAE (Wu et al., 2016 ): Stands for Collaborative Denoising Auto-Encoder, used in recommendation systems.", | |
"MultVAE (Liang et al., 2018 ): A model extending Variational Autoencoders to collaborative filtering for implicit feedback.", | |
"EASE (Steck, 2019 ): A recommendation technique called Embarrassingly Shallow Autoencoders for Sparse Data.", | |
"P3a (Cooper et al., 2014 ): A method that uses ordering rules from random walks on a user-item graph.", | |
"RP3b (Paudel et al., 2016 ): A recommender that re-ranks items based on 3-hop random walk transition probabilities.", | |
"NGCF (Wang et al., 2019 ): Employs graph embedding propagation layers to generate user/item representations.", | |
"LightGCN (He et al., 2020 ): Utilizes neighborhood information in the user-item interaction graph.", | |
"SLRec (Yao et al., 2021 ): A method using contrastive learning among node features.", | |
"SGL (Wu et al., 2021 ): Enhances LightGCN with self-supervised contrastive learning.", | |
"GCCF (Chen et al., 2020 ): A multi-layer graph convolutional network for recommendation.", | |
"NCL (Lin et al., 2022 ): Enhances recommendation models with neighborhood-enriched contrastive learning.", | |
"DirectAU (Wang et al., 2022 ): Focuses on the quality of representation based on alignment and uniformity.", | |
"HG-GNN (Pang et al., 2022 ): Constructs a heterogeneous graph with both user nodes and item nodes and uses a graph neural network to learn the embedding of nodes as a potential representation of users or items.", | |
"A-PGNN (Zhang et al., 2020 ): Uses GNN to extract session representations for intra-session interactions and uses an attention mechanism to learn features between sessions.", | |
"AdaGCL (Jiang et al., 2023 ): Combines a graph generator and a graph denoising model for contrastive views.", | |
"MvDGAE (Zheng et al., 2021 ): Stands for Multi-view Denoising Graph AutoEncoders.", | |
"STAMP (Liu et al., 2018 ): A model based on the attention mechanism to model user behavior sequence data.", | |
"GRU4Rec (Hidasi et al., 2016 ): Utilizes Gated Recurrent Units for session-based recommendations.", | |
"BERT4Rec (Sun et al., 2019 ): A model for the sequence-based recommendation that handles long user behavior sequences.", | |
"CL4Rec (Xie et al., 2022 ): An improved version of BERT4Rec with locality-sensitive hashing for faster item retrieval.", | |
"CoScRec (Liu et al., 2021 ): It explores an innovative recommendation approach that enhances sequential recommendation systems through robust data augmentation and contrastive self-supervised learning techniques.", | |
"TiCoSeRec (Dang et al., 2023 ): A method based on CoSeRec, utilizing data augmentation algorithms for sequence recommendation improvement.", | |
"The hyperparameter, , also has a critical impact on experimental results.", | |
"We set and to conduct experiments respectively.", | |
"The experimental results are shown in Table C.2 .", | |
"It can be seen our model performs well under all settings.", | |
"[ tabular=cccc, table head= , late after line= , late after last line= ]data/exp_para_k.txt\\csvlinetotablerow Expect Expect Click Click Rec.", | |
"Beyond its effectiveness in performance, BISTRO also boasts considerable interpretability.", | |
"To demonstrate how the framework mitigates both the job preference drift and data noise problems, we present a real-life scenario to illustrate the logic behind the suggestions made by BISTRO, as shown in Figure 5 .", | |
"In this figure, jobs with IDs ***872 and ***994 are two job positions that are newly posted in the online recruitment system, while IDs ***265 and ***523 are two job positions that a large number of users interact with frequently.", | |
"Among them, ***872 and ***265, as well as ***994 and ***523, have similar occupational demand descriptions respectively.", | |
"Also, the user with ID ***175 shared a similar resume with user ID ***479 before ***175 modified the resume, and after his resume was changed, ***175 had a similar content with user ***013.", | |
"Recommendations in this scenario can be divided into three examples: Example 1 (Recommendation for a dynamically changing user) Consider the user represented by ID ***175, BISTRO addresses this challenge by deploying content-based analysis.", | |
"The framework utilizes the user s social network and a set of resume attributes collected to create a composite feature profile to identify users with similar tastes.", | |
"Subsequently, it recommends a job with ID ***523 favored by a like-minded user with ID ***013 to him.", | |
"Example 2 (Recommendation for a new job) A newly posted job with ID ***872 lacks any user interaction data, complicating the generation of a meaningful representation for it.", | |
"BISTRO, however, overcomes this by incorporating auxiliary information such as skill requirements and working experience, and then associated tags to locate similar content.", | |
"By leveraging this approach combined with the user s expectations, BISTRO acquires a rich and informative embedding for the job, enabling it to recommend the job to users who have shown an interest in comparable jobs.", | |
"Example 3 (Recommend a new job to a dynamically changing user) Combining both two situations illustrated above, BISTRO deals with this complex challenge by utilizing a wavelet graph denoising filter and graph representation method.", | |
"In this way, it can recommend the latest jobs with similar job content to users with the same real-time needs as well as similar user content characteristics.", | |
"[ tabular=ccccccc, table head= Shenzhen Shanghai Beijing H@10 M@10 H@10 M@10 H@10 M@10 , late after line= , late after last line= ]data/app_case.txt Bold indicates the statistically significant improvements (i.e., two-sided t-test with p ¡ 0.05) over the best baseline (underlined).", | |
"For all metrics: the higher, the better.", | |
"In addition, we also compare the proposed framework, BISTRO, with multiple job state-of-the-art recommender systems, i.e., InEXIT (Shao et al., 2023 ), DGMN (Bian et al., 2019 ), and APJFMF (Jian et al., 2024 ).", | |
"The result can be found in Table C.3 .", | |
"We can see that our framework acheves the best among all baselines, which verify the effectiveness of our method.", | |
"" | |
], | |
"target_context_ids": [ | |
2, | |
6, | |
7 | |
], | |
"selected_paragraphs": [ | |
"[paragraph id = 2] The comparison results are shown in Table 5.4 .", | |
"[paragraph id = 6] As shown in Table 5.4 , the density of the graph is doubled when we add all types of hyperedges and the experimental results have also been improved due to the optimization of the structure in the graph.", | |
"[paragraph id = 7] [ tabular=cccc, table head= Density H@10 M@10 , late after line= , late after last line= ]data/exp_hyperedge.txt\\csvlinetotablerow Non-Noise Add 10% Noise Add 20% Noise Add 50% Noise The validity of hypergraph wavelet filter In BISTRO, we design a novel hypergraph wavelet learning method." | |
], | |
"table_html": "<figure class=\"ltx_table ltx_figure_panel\" id=\"S5.SS4.2.tab1\">\n<figcaption class=\"ltx_caption\"><span class=\"ltx_tag ltx_tag_table\"><span class=\"ltx_text\" id=\"S5.SS4.2.tab1.1.1.1\" style=\"font-size:90%;\">Table 5</span>. </span><span class=\"ltx_text\" id=\"S5.SS4.2.tab1.2.2\" style=\"font-size:90%;\">Results under different graph constructions.</span></figcaption><div class=\"ltx_flex_figure\">\n<div class=\"ltx_flex_cell ltx_flex_size_1\"><span class=\"ltx_ERROR ltx_figure_panel undefined\" id=\"S5.SS4.2.tab1.3\">\\csvreader</span></div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<p class=\"ltx_p ltx_figure_panel\" id=\"S5.SS4.2.tab1.4\">[\ntabular=cccc,\ntable head=</p>\n</div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<p class=\"ltx_p ltx_figure_panel\" id=\"S5.SS4.2.tab1.5\"><span class=\"ltx_text ltx_font_bold\" id=\"S5.SS4.2.tab1.5.1\">Density H@10 M@10</span></p>\n</div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<p class=\"ltx_p ltx_figure_panel\" id=\"S5.SS4.2.tab1.6\"><span class=\"ltx_text ltx_font_bold\" id=\"S5.SS4.2.tab1.6.1\">,\nlate after line=\n<br class=\"ltx_break\"/>,\nlate after last line= \n<br class=\"ltx_break\"/>]data/exp_hyperedge.txt<span class=\"ltx_ERROR undefined\" id=\"S5.SS4.2.tab1.6.1.1\">\\csvlinetotablerow</span>\n</span></p>\n</div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<figure class=\"ltx_figure ltx_figure_panel\" id=\"S5.F4.sf2\">\n</figure>\n</div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<figure class=\"ltx_figure ltx_figure_panel\" id=\"S4.F4\">\n<div class=\"ltx_flex_figure\">\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<div class=\"ltx_inline-block ltx_figure_panel ltx_align_center ltx_transformed_outer\" id=\"S4.F4.1\" style=\"width:433.6pt;height:468.6pt;vertical-align:-0.8pt;\"><span class=\"ltx_transformed_inner\" style=\"transform:translate(-63.0pt,68.0pt) scale(0.774756653865617,0.774756653865617) ;\"><svg class=\"ltx_picture\" height=\"835.55\" id=\"S4.F4.1.pic1\" overflow=\"visible\" version=\"1.1\" width=\"765.21\"><g fill=\"#000000\" stroke=\"#000000\" stroke-width=\"0.8pt\" transform=\"translate(0,835.55) matrix(1 0 0 -1 0 0) translate(1.54,0) translate(0,-14.16)\"><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 49.48 620.16)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/0_0.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 285.7 620.16)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/1_0.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 521.92 620.16)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 
0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.3.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/2_0.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 49.48 435.12)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.4.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/0_10.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 285.7 435.12)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.5.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/1_10.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 521.92 435.12)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.6.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/2_10.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 49.48 254.02)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.7.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/0_20.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 285.7 254.02)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.8.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/1_20.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 521.92 254.02)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.9.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/2_20.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 49.48 68.98)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.10.10.10.10.10.10.10.10.10.10.10.10.10.10.10.10.10.10.10.10.10.10.10.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/0_50.png\" 
width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 285.7 68.98)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.11.11.11.11.11.11.11.11.11.11.11.11.11.11.11.11.11.11.11.11.11.11.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/1_50.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 521.92 68.98)\"><foreignobject height=\"177\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"216\"><img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_square\" height=\"177\" id=\"S4.F4.1.pic1.12.12.12.12.12.12.12.12.12.12.12.12.12.12.12.12.12.12.12.12.12.1.1.1.1.1.1.1.1.1.1.g1\" src=\"extracted/5686780/fig/ball/2_50.png\" width=\"216\"/></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 78.45 812.68)\"><foreignobject height=\"28.18\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"158.07\"><span class=\"ltx_text\" id=\"S4.F4.1.pic1.13.13.13.1.1\" style=\"font-size:298%;\">BISTRO</span></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 188.85 816.46)\"><foreignobject height=\"36.65\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"408.56\"><span class=\"ltx_text\" id=\"S4.F4.1.pic1.14.14.14.1.1\" style=\"font-size:298%;\">General Spectral GNN</span></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 500.78 812.45)\"><foreignobject height=\"28.63\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"258.29\"><span class=\"ltx_text\" id=\"S4.F4.1.pic1.15.15.15.1.1\" style=\"font-size:298%;\">Random Walk</span></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 5.6 615.77)\"><foreignobject height=\"185.78\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"28.18\">\n<div class=\"ltx_inline-block ltx_transformed_outer\" id=\"S4.F4.1.pic1.16.16.16.1.1\" style=\"width:20.4pt;height:134.3pt;vertical-align:-0.0pt;\"><span class=\"ltx_transformed_inner\" style=\"width:134.3pt;transform:translate(-56.95pt,-56.95pt) rotate(-90deg) ;\">\n<p class=\"ltx_p\" id=\"S4.F4.1.pic1.16.16.16.1.1.1\"><span class=\"ltx_text\" id=\"S4.F4.1.pic1.16.16.16.1.1.1.1\" style=\"font-size:298%;\">Non-Noise</span></p>\n</span></div></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 3.08 384.91)\"><foreignobject height=\"277.42\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"33.22\">\n<div class=\"ltx_inline-block ltx_transformed_outer\" id=\"S4.F4.1.pic1.17.17.17.1.1\" style=\"width:24.0pt;height:200.5pt;vertical-align:-0.0pt;\"><span class=\"ltx_transformed_inner\" style=\"width:200.5pt;transform:translate(-88.24pt,-87.41pt) rotate(-90deg) ;\">\n<p class=\"ltx_p\" id=\"S4.F4.1.pic1.17.17.17.1.1.1\"><span class=\"ltx_text\" id=\"S4.F4.1.pic1.17.17.17.1.1.1.1\" style=\"font-size:298%;\">Add 10% Noise</span></p>\n</span></div></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 3.08 203.81)\"><foreignobject height=\"277.42\" overflow=\"visible\" 
transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"33.22\">\n<div class=\"ltx_inline-block ltx_transformed_outer\" id=\"S4.F4.1.pic1.18.18.18.1.1\" style=\"width:24.0pt;height:200.5pt;vertical-align:-0.0pt;\"><span class=\"ltx_transformed_inner\" style=\"width:200.5pt;transform:translate(-88.24pt,-87.41pt) rotate(-90deg) ;\">\n<p class=\"ltx_p\" id=\"S4.F4.1.pic1.18.18.18.1.1.1\"><span class=\"ltx_text\" id=\"S4.F4.1.pic1.18.18.18.1.1.1.1\" style=\"font-size:298%;\">Add 20% Noise</span></p>\n</span></div></foreignobject></g><g color=\"#000000\" fill=\"#000000\" stroke=\"#000000\" transform=\"matrix(1.0 0.0 0.0 1.0 3.08 18.77)\"><foreignobject height=\"277.42\" overflow=\"visible\" transform=\"matrix(1 0 0 -1 0 16.6)\" width=\"33.22\">\n<div class=\"ltx_inline-block ltx_transformed_outer\" id=\"S4.F4.1.pic1.19.19.19.1.1\" style=\"width:24.0pt;height:200.5pt;vertical-align:-0.0pt;\"><span class=\"ltx_transformed_inner\" style=\"width:200.5pt;transform:translate(-88.24pt,-87.41pt) rotate(-90deg) ;\">\n<p class=\"ltx_p\" id=\"S4.F4.1.pic1.19.19.19.1.1.1\"><span class=\"ltx_text\" id=\"S4.F4.1.pic1.19.19.19.1.1.1.1\" style=\"font-size:298%;\">Add 50% Noise</span></p>\n</span></div></foreignobject></g></g></svg>\n</span></div>\n</div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<p class=\"ltx_p ltx_figure_panel ltx_align_center\" id=\"S4.F4.2\"><span class=\"ltx_text ltx_inline-block\" id=\"S4.F4.2.1\" style=\"width:433.6pt;\">\n<img alt=\"Refer to caption\" class=\"ltx_graphics ltx_img_landscape\" height=\"119\" id=\"S4.F4.2.1.g1\" src=\"x1.png\" width=\"457\"/>\n</span></p>\n</div>\n</div>\n<figcaption class=\"ltx_caption ltx_centering\"><span class=\"ltx_tag ltx_tag_figure\"><span class=\"ltx_text\" id=\"S4.F4.4.1.1\" style=\"font-size:90%;\">Figure 4</span>. 
</span><span class=\"ltx_text\" id=\"S4.F4.5.2\" style=\"font-size:90%;\">Results of denoising performance.\n</span></figcaption>\n</figure>\n</div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<figure class=\"ltx_figure ltx_figure_panel\" id=\"S5.F4.sf3\">\n</figure>\n</div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<p class=\"ltx_p ltx_figure_panel\" id=\"S5.SS4.2.tab1.7\"><span class=\"ltx_text ltx_font_bold\" id=\"S5.SS4.2.tab1.7.1\">The validity of hypergraph wavelet filter</span> In BISTRO, we design a novel hypergraph wavelet learning method.\nIn this learning method, a wavelet filter is deployed for data denoising as well as fine-grained job preference feature extraction.\nAs shown in Figure <span class=\"ltx_ref ltx_missing_label ltx_ref_self\">LABEL:fig:exp_filter_line</span>, the curves illustrate the results of three models, which have different filtering settings, under different percentages of noise in the data.\nWe can also visualize from it that our method, BISTRO, has the smoothest decrease in model performance as the proportion of noise in the data increases.</p>\n</div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<p class=\"ltx_p ltx_figure_panel\" id=\"S5.SS4.2.tab1.8\">To vividly show the denoising capability of the proposed hypergraph wavelet filter, we randomly select a user who is active in a week, filter the 50 most recent interactions from three job categories, and construct an interaction graph.\nIn this graph, each node represents a job the user has engaged with, interconnected by grey dotted lines, while the interaction sequence of the user is depicted with grey edges.\nOn this basis, we introduce noisy jobs (marked with orange crosses) and their corresponding interactions (denoted by orange edges and dotted lines) to mimic the effect of a user accidentally clicking on unrelated job types.\nGiven that each model generates job preference representations for diverse jobs, we visualize the connections between the user and jobs, as well as the relationships among jobs themselves, as shown in Figure <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#S4.F4\" title=\"Figure 4 ‣ 5.4. Ablation Study (RQ2, RQ3) ‣ 5.3. Overall Performance (RQ1) ‣ 5.2. Experimental Settings ‣ 5.1. Dataset ‣ 5. Experiments ‣ Adapting Job Recommendations to User Preference Drift with Behavioral-Semantic Fusion Learning\">4 ###reference_### ###reference_### ###reference_### ###reference_###</a>.\nWe eliminate edges whose cosine similarity between job representation pairs fell below a uniform threshold and remove links between isolated jobs and the user.\nConsequently, a graph with more orange lines indicates lower model performance.\nNotably, as data noise levels escalated, the comparative models demonstrated diminished noise filtering effectiveness relative to our proposed approach. Specifically, the random walk-based method significantly underperformed compared to the spectral GCN method, primarily due to the ability of spectral graph neural networks to filter out irrelevant interaction features. 
Furthermore, our approach employs a wavelet kernel to create a set of sub-filters, adeptly denoising by dynamically selecting appropriate filters for the user’s evolving characteristics.</p>\n</div>\n<div class=\"ltx_flex_break\"></div>\n<div class=\"ltx_flex_cell ltx_flex_size_1\">\n<section class=\"ltx_subsection ltx_figure_panel\" id=\"S5.SS5\">\n<h3 class=\"ltx_title ltx_title_subsection\">\n<span class=\"ltx_tag ltx_tag_subsection\">5.5. </span>Parametric Study <em class=\"ltx_emph ltx_font_italic\" id=\"S5.SS5.1.1\">(RQ4)</em>\n</h3>\n<div class=\"ltx_para ltx_noindent\" id=\"S5.SS5.p1\">\n<p class=\"ltx_p\" id=\"S5.SS5.p1.4\"><span class=\"ltx_text ltx_font_bold\" id=\"S5.SS5.p1.4.1\">The size of user (job) groups</span> The size of user and job groups are two hyperparameters that need to be predefined.\nTherefore, we choose 500:1, 1000:1, and 2000:1 as the ratios of the total number of users and the number of user groups , and 100:1, 500:1, 1000:1 as the ratios of the total number of jobs and the number of job groups for our experiments respectively, as shown in Figure <span class=\"ltx_ref ltx_missing_label ltx_ref_self\">LABEL:fig:para_user</span> and <span class=\"ltx_ref ltx_missing_label ltx_ref_self\">LABEL:fig:para_job</span>.\nWe can easily observe that our model achieves best when 1000:1 and 500:1.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"S5.SS5.p2\">\n<p class=\"ltx_p\" id=\"S5.SS5.p2.5\"><span class=\"ltx_text ltx_font_bold\" id=\"S5.SS5.p2.5.1\">The order of Chevbyshev approximation</span> The order of Chevbyshev approximation greatly impacts the performance of hypergraph wavelet neural networks.\nTo find the best order, we test our model with and , and the results are shown in Figure <span class=\"ltx_ref ltx_missing_label ltx_ref_self\">LABEL:fig:para_p</span>.\nWe can see that the performance of the model remains constant when is greater than or equal to 3. Notice that as increases, the computational overhead of the model will also increase, so we choose as the hyperparameter of our model.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"S5.SS5.p3\">\n<p class=\"ltx_p\" id=\"S5.SS5.p3.1\"><span class=\"ltx_text ltx_font_bold\" id=\"S5.SS5.p3.1.1\">The average length of a session</span> The length of the session is another hyperparameter that affects the performance of the model.\nIn Figure <span class=\"ltx_ref ltx_missing_label ltx_ref_self\">LABEL:fig:session_len_a</span>, we can see that the average length of each session is 19-37 on average, and such short behavioral sequences in job recommendations (In job recommendations, the average number of interactions for users to find a suitable job is more than 80 times) are easily interfered with by noisy interactions.\nTherefore, we further compared our proposed framework with the top-2 baselines under different session length, as illustrated in Figure <span class=\"ltx_ref ltx_missing_label ltx_ref_self\">LABEL:fig:session_len_b</span>.\nIt can be seen that when the session length is relatively short, noise has a huge negative impact on the accuracy of all models. However, as the session length decreases, our framework is more robust than the other two methods and can better resist noise interference.</p>\n</div>\n<figure class=\"ltx_figure\" id=\"S5.F4.sf4\">\n</figure>\n<section class=\"ltx_section\" id=\"S6\">\n<h2 class=\"ltx_title ltx_title_section\">\n<span class=\"ltx_tag ltx_tag_section\">6. 
</span>Conclusion</h2>\n<div class=\"ltx_para\" id=\"S6.p1\">\n<p class=\"ltx_p\" id=\"S6.p1.1\">This study introduces BISTRO, an innovative framework designed to navigate the challenges of job preference drift and the subsequent data noise.\nThe framework is structured around three modules: a coarse-grained semantic clustering module, a fine-grained job preference extraction module, and a personalized top- job recommendation module.\nSpecifically, a hypergraph is constructed to deal with the preference drift issue and a novel hypergraph wavelet learning method is proposed to filter the noise in interactions when extracting job preferences.\nThe effectiveness and clarity of BISTRO are validated through experiments conducted with both offline and online environments.\nLooking ahead, we aim to continue refining BISTRO to enhance its applicability in broader contexts, particularly in scenarios characterized by anomalous data.</p>\n</div>\n<section class=\"ltx_bibliography\" id=\"bib\">\n<h2 class=\"ltx_title ltx_title_bibliography\">References</h2>\n<ul class=\"ltx_biblist\">\n<li class=\"ltx_bibitem\" id=\"bib.bib1\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">(1)</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib2\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Bian et al<span class=\"ltx_text\" id=\"bib.bib2.2.2.1\">.</span> (2019)</span>\n<span class=\"ltx_bibblock\">\nShuqing Bian, Wayne Xin Zhao, Yang Song, Tao Zhang, and Ji-Rong Wen. 2019.\n\n</span>\n<span class=\"ltx_bibblock\">Domain adaptation for person-job fit with transferable deep global match network. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib2.3.1\">Proc. of EMNLP</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib3\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Boyd (2001)</span>\n<span class=\"ltx_bibblock\">\nJohn P Boyd. 2001.\n\n</span>\n<span class=\"ltx_bibblock\"><em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib3.1.1\">Chebyshev and Fourier spectral methods</em>.\n\n</span>\n<span class=\"ltx_bibblock\">Courier Corporation.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib4\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Bruna et al<span class=\"ltx_text\" id=\"bib.bib4.2.2.1\">.</span> (2014)</span>\n<span class=\"ltx_bibblock\">\nJoan Bruna, Wojciech Zaremba, Arthur Szlam, and Yann LeCun. 2014.\n\n</span>\n<span class=\"ltx_bibblock\">Spectral Networks and Locally Connected Networks on Graphs. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib4.3.1\">Proc. of ICLR</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib5\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Bu et al<span class=\"ltx_text\" id=\"bib.bib5.2.2.1\">.</span> (2010)</span>\n<span class=\"ltx_bibblock\">\nJiajun Bu, Shulong Tan, Chun Chen, Can Wang, Hao Wu, Lijun Zhang, and Xiaofei He. 2010.\n\n</span>\n<span class=\"ltx_bibblock\">Music recommendation by unified hypergraph: combining social media information and music content. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib5.3.1\">Proc. 
of ACM MM</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib6\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Chen et al<span class=\"ltx_text\" id=\"bib.bib6.2.2.1\">.</span> (2020)</span>\n<span class=\"ltx_bibblock\">\nLei Chen, Le Wu, Richang Hong, Kun Zhang, and Meng Wang. 2020.\n\n</span>\n<span class=\"ltx_bibblock\">Revisiting graph based collaborative filtering: A linear residual graph convolutional network approach. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib6.3.1\">Proc. of AAAI</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib7\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Cooper et al<span class=\"ltx_text\" id=\"bib.bib7.2.2.1\">.</span> (2014)</span>\n<span class=\"ltx_bibblock\">\nColin Cooper, Sang Hyuk Lee, Tomasz Radzik, and Yiannis Siantos. 2014.\n\n</span>\n<span class=\"ltx_bibblock\">Random walks in recommender systems: exact computation and simulations. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib7.3.1\">Proc. of WWW</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib8\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Cremonesi et al<span class=\"ltx_text\" id=\"bib.bib8.2.2.1\">.</span> (2010)</span>\n<span class=\"ltx_bibblock\">\nPaolo Cremonesi, Yehuda Koren, and Roberto Turrin. 2010.\n\n</span>\n<span class=\"ltx_bibblock\">Performance of recommender algorithms on top-n recommendation tasks. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib8.3.1\">Proceedings of the fourth ACM conference on Recommender systems</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib9\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Dai et al<span class=\"ltx_text\" id=\"bib.bib9.2.2.1\">.</span> (2021)</span>\n<span class=\"ltx_bibblock\">\nEnyan Dai, Charu Aggarwal, and Suhang Wang. 2021.\n\n</span>\n<span class=\"ltx_bibblock\">Nrgnn: Learning a label noise resistant graph neural network on sparsely and noisily labeled graphs. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib9.3.1\">Proc. of KDD</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib10\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Dang et al<span class=\"ltx_text\" id=\"bib.bib10.2.2.1\">.</span> (2023)</span>\n<span class=\"ltx_bibblock\">\nYizhou Dang, Enneng Yang, Guibing Guo, Linying Jiang, Xingwei Wang, Xiaoxiao Xu, Qinghui Sun, and Hong Liu. 2023.\n\n</span>\n<span class=\"ltx_bibblock\">Uniform Sequence Better: Time Interval Aware Data Augmentation for Sequential Recommendation. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib10.3.1\">Proc. of AAAI</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib11\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">He et al<span class=\"ltx_text\" id=\"bib.bib11.2.2.1\">.</span> (2020)</span>\n<span class=\"ltx_bibblock\">\nXiangnan He, Kuan Deng, Xiang Wang, Yan Li, Yongdong Zhang, and Meng Wang. 2020.\n\n</span>\n<span class=\"ltx_bibblock\">Lightgcn: Simplifying and powering graph convolution network for recommendation. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib11.3.1\">Proc. 
of SIGIR</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib12\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Hidasi et al<span class=\"ltx_text\" id=\"bib.bib12.2.2.1\">.</span> (2016)</span>\n<span class=\"ltx_bibblock\">\nBalázs Hidasi, Alexandros Karatzoglou, Linas Baltrunas, and Domonkos Tikk. 2016.\n\n</span>\n<span class=\"ltx_bibblock\">Session-based Recommendations with Recurrent Neural Networks. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib12.3.1\">Proc. of ICLR</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib13\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Hu et al<span class=\"ltx_text\" id=\"bib.bib13.2.2.1\">.</span> (2018)</span>\n<span class=\"ltx_bibblock\">\nGuangneng Hu, Yu Zhang, and Qiang Yang. 2018.\n\n</span>\n<span class=\"ltx_bibblock\">CoNet: Collaborative Cross Networks for Cross-Domain Recommendation. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib13.3.1\">Proc. of CIKM</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib14\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Hu et al<span class=\"ltx_text\" id=\"bib.bib14.2.2.1\">.</span> (2023)</span>\n<span class=\"ltx_bibblock\">\nXiao Hu, Yuan Cheng, Zhi Zheng, Yue Wang, Xinxin Chi, and Hengshu Zhu. 2023.\n\n</span>\n<span class=\"ltx_bibblock\">BOSS: A Bilateral Occupational-Suitability-Aware Recommender System for Online Recruitment. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib14.3.1\">Proc. of KDD</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib15\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Huang (2021)</span>\n<span class=\"ltx_bibblock\">\nChao Huang. 2021.\n\n</span>\n<span class=\"ltx_bibblock\">Recent Advances in Heterogeneous Relation Learning for Recommendation. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib15.1.1\">Proc. of IJCAI</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib16\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Huang et al<span class=\"ltx_text\" id=\"bib.bib16.2.2.1\">.</span> (2021)</span>\n<span class=\"ltx_bibblock\">\nYuzhen Huang, Xiaohan Wei, Xing Wang, Jiyan Yang, Bor-Yiing Su, Shivam Bharuka, Dhruv Choudhary, Zewei Jiang, Hai Zheng, and Jack Langman. 2021.\n\n</span>\n<span class=\"ltx_bibblock\">Hierarchical training: Scaling deep recommendation models on large cpu clusters. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib16.3.1\">Proc. of KDD</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib17\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Insights ([n. d.])</span>\n<span class=\"ltx_bibblock\">\nFortune Business Insights. [n. d.].\n\n</span>\n<span class=\"ltx_bibblock\">Online Recruitment Technology Market: Forecast Report 2030.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib18\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Jian et al<span class=\"ltx_text\" id=\"bib.bib18.2.2.1\">.</span> (2024)</span>\n<span class=\"ltx_bibblock\">\nLing Jian, Chongzhi Rao, and Xiao Gu. 
2024.\n\n</span>\n<span class=\"ltx_bibblock\">Your Profile Reveals Your Traits in Talent Market: An Enhanced Person-Job Fit Representation Learning.\n\n</span>\n<span class=\"ltx_bibblock\">(2024).\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib19\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Jiang et al<span class=\"ltx_text\" id=\"bib.bib19.2.2.1\">.</span> (2023)</span>\n<span class=\"ltx_bibblock\">\nYangqin Jiang, Chao Huang, and Lianghao Huang. 2023.\n\n</span>\n<span class=\"ltx_bibblock\">Adaptive graph contrastive learning for recommendation. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib19.3.1\">Proc. of KDD</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib20\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Kannout et al<span class=\"ltx_text\" id=\"bib.bib20.2.2.1\">.</span> (2023)</span>\n<span class=\"ltx_bibblock\">\nEyad Kannout, Michal Grodzki, and Marek Grzegorowski. 2023.\n\n</span>\n<span class=\"ltx_bibblock\">Towards addressing item cold-start problem in collaborative filtering by embedding agglomerative clustering and FP-growth into the recommendation system.\n\n</span>\n<span class=\"ltx_bibblock\"><em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib20.3.1\">Comput. Sci. Inf. Syst.</em> (2023).\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib21\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Koren et al<span class=\"ltx_text\" id=\"bib.bib21.2.2.1\">.</span> (2009)</span>\n<span class=\"ltx_bibblock\">\nYehuda Koren, Robert Bell, and Chris Volinsky. 2009.\n\n</span>\n<span class=\"ltx_bibblock\">Matrix factorization techniques for recommender systems.\n\n</span>\n<span class=\"ltx_bibblock\"><em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib21.3.1\">Computer</em> (2009).\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib22\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Li et al<span class=\"ltx_text\" id=\"bib.bib22.2.2.1\">.</span> (2018)</span>\n<span class=\"ltx_bibblock\">\nCheng-Te Li, Chia-Tai Hsu, and Man-Kwan Shan. 2018.\n\n</span>\n<span class=\"ltx_bibblock\">A Cross-Domain Recommendation Mechanism for Cold-Start Users Based on Partial Least Squares Regression.\n\n</span>\n<span class=\"ltx_bibblock\"><em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib22.3.1\">ACM Trans. Intell. Syst. Technol.</em> (2018).\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib23\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Li and Li (2013)</span>\n<span class=\"ltx_bibblock\">\nLei Li and Tao Li. 2013.\n\n</span>\n<span class=\"ltx_bibblock\">News recommendation via hypergraph learning: encapsulation of user behavior and news content. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib23.1.1\">Proc. of WSDM</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib24\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Li et al<span class=\"ltx_text\" id=\"bib.bib24.2.2.1\">.</span> (2023)</span>\n<span class=\"ltx_bibblock\">\nMuyang Li, Zijian Zhang, Xiangyu Zhao, Wanyu Wang, Minghao Zhao, Runze Wu, and Ruocheng Guo. 2023.\n\n</span>\n<span class=\"ltx_bibblock\">Automlp: Automated mlp for sequential recommendations. 
In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib24.3.1\">Proceedings of the ACM Web Conference 2023</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib25\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Li et al<span class=\"ltx_text\" id=\"bib.bib25.2.2.1\">.</span> (2022)</span>\n<span class=\"ltx_bibblock\">\nXinhang Li, Zhaopeng Qiu, Xiangyu Zhao, Zihao Wang, Yong Zhang, Chunxiao Xing, and Xian Wu. 2022.\n\n</span>\n<span class=\"ltx_bibblock\">Gromov-wasserstein guided representation learning for cross-domain recommendation. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib25.3.1\">Proc. of CIKM</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib26\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Liang et al<span class=\"ltx_text\" id=\"bib.bib26.2.2.1\">.</span> (2018)</span>\n<span class=\"ltx_bibblock\">\nDawen Liang, Rahul G Krishnan, Matthew D Hoffman, and Tony Jebara. 2018.\n\n</span>\n<span class=\"ltx_bibblock\">Variational autoencoders for collaborative filtering. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib26.3.1\">Proc. of WWW</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib27\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Liao et al<span class=\"ltx_text\" id=\"bib.bib27.2.2.1\">.</span> (2021)</span>\n<span class=\"ltx_bibblock\">\nShu-Hsien Liao, Retno Widowati, and Yu-Chieh Hsieh. 2021.\n\n</span>\n<span class=\"ltx_bibblock\">Investigating online social media users’ behaviors for social commerce recommendations.\n\n</span>\n<span class=\"ltx_bibblock\"><em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib27.3.1\">Technology in Society</em> (2021).\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib28\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Lin et al<span class=\"ltx_text\" id=\"bib.bib28.2.2.1\">.</span> (2017)</span>\n<span class=\"ltx_bibblock\">\nHao Lin, Hengshu Zhu, Yuan Zuo, Chen Zhu, Junjie Wu, and Hui Xiong. 2017.\n\n</span>\n<span class=\"ltx_bibblock\">Collaborative company profiling: Insights from an employee’s perspective. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib28.3.1\">Proc. of AAAI</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib29\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Lin et al<span class=\"ltx_text\" id=\"bib.bib29.2.2.1\">.</span> (2022)</span>\n<span class=\"ltx_bibblock\">\nZihan Lin, Changxin Tian, Yupeng Hou, and Wayne Xin Zhao. 2022.\n\n</span>\n<span class=\"ltx_bibblock\">Improving graph collaborative filtering with neighborhood-enriched contrastive learning. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib29.3.1\">Proceedings of the ACM Web Conference 2022</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib30\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Liu et al<span class=\"ltx_text\" id=\"bib.bib30.2.2.1\">.</span> (2022)</span>\n<span class=\"ltx_bibblock\">\nBulou Liu, Bing Bai, Weibang Xie, Yiwen Guo, and Hao Chen. 2022.\n\n</span>\n<span class=\"ltx_bibblock\">Task-optimized user clustering based on mobile app usage for cold-start recommendations. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib30.3.1\">Proc. 
of KDD</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib31\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Liu et al<span class=\"ltx_text\" id=\"bib.bib31.2.2.1\">.</span> (2023b)</span>\n<span class=\"ltx_bibblock\">\nQidong Liu, Fan Yan, Xiangyu Zhao, Zhaocheng Du, Huifeng Guo, Ruiming Tang, and Feng Tian. 2023b.\n\n</span>\n<span class=\"ltx_bibblock\">Diffusion augmentation for sequential recommendation. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib31.3.1\">Proc. of CIKM</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib32\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Liu et al<span class=\"ltx_text\" id=\"bib.bib32.2.2.1\">.</span> (2018)</span>\n<span class=\"ltx_bibblock\">\nQiao Liu, Yifu Zeng, Refuoe Mokhosi, and Haibin Zhang. 2018.\n\n</span>\n<span class=\"ltx_bibblock\">STAMP: Short-Term Attention/Memory Priority Model for Session-based Recommendation. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib32.3.1\">Proc. of KDD</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib33\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Liu et al<span class=\"ltx_text\" id=\"bib.bib33.2.2.1\">.</span> (2021)</span>\n<span class=\"ltx_bibblock\">\nZhiwei Liu, Yongjun Chen, Jia Li, Philip S. Yu, Julian J. McAuley, and Caiming Xiong. 2021.\n\n</span>\n<span class=\"ltx_bibblock\">Contrastive Self-supervised Sequential Recommendation with Robust Augmentation.\n\n</span>\n<span class=\"ltx_bibblock\"><em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib33.3.1\">CoRR</em> (2021).\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib34\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Liu et al<span class=\"ltx_text\" id=\"bib.bib34.2.2.1\">.</span> (2023a)</span>\n<span class=\"ltx_bibblock\">\nZiru Liu, Jiejie Tian, Qingpeng Cai, Xiangyu Zhao, Jingtong Gao, Shuchang Liu, Dayou Chen, Tonghao He, Dong Zheng, Peng Jiang, et al<span class=\"ltx_text\" id=\"bib.bib34.3.1\">.</span> 2023a.\n\n</span>\n<span class=\"ltx_bibblock\">Multi-task recommendations with reinforcement learning. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib34.4.1\">Proceedings of the ACM Web Conference 2023</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib35\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Ning and Karypis (2011)</span>\n<span class=\"ltx_bibblock\">\nXia Ning and George Karypis. 2011.\n\n</span>\n<span class=\"ltx_bibblock\">Slim: Sparse linear methods for top-n recommender systems. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib35.1.1\">Proc. of ICDM</em>.\n\n</span>\n<span class=\"ltx_bibblock\">\n</span>\n</li>\n<li class=\"ltx_bibitem\" id=\"bib.bib36\">\n<span class=\"ltx_tag ltx_role_refnum ltx_tag_bibitem\">Pang et al<span class=\"ltx_text\" id=\"bib.bib36.2.2.1\">.</span> (2022)</span>\n<span class=\"ltx_bibblock\">\nYitong Pang, Lingfei Wu, Qi Shen, Yiming Zhang, Zhihua Wei, Fangli Xu, Ethan Chang, Bo Long, and Jian Pei. 2022.\n\n</span>\n<span class=\"ltx_bibblock\">Heterogeneous global graph neural networks for personalized session-based recommendation. In <em class=\"ltx_emph ltx_font_italic\" id=\"bib.bib36.3.1\">Proc. 
of WSDM.

Bibek Paudel, Fabian Christoffel, Chris Newell, and Abraham Bernstein. 2016. Updatable, accurate, diverse, and scalable recommendations for interactive applications. ACM Transactions on Interactive Intelligent Systems (TiiS) (2016).

Chuan Qin, Le Zhang, Rui Zha, Dazhong Shen, Qi Zhang, Ying Sun, Chen Zhu, Hengshu Zhu, and Hui Xiong. 2023. A comprehensive survey of artificial intelligence techniques for talent analytics. arXiv preprint arXiv:2307.03195 (2023).

Michael Reusens, Wilfried Lemahieu, Bart Baesens, and Luc Sels. 2018. Evaluating recommendation and search in the labor market. Knowledge-Based Systems (2018).

Taihua Shao, Chengyu Song, Jianming Zheng, Fei Cai, Honghui Chen, et al. 2023. Exploring internal and external interactions for semi-structured multivariate attributes in job-resume matching. International Journal of Intelligent Systems (2023).

Ian H Sloan and William E Smith. 1980. Product integration with the Clenshaw-Curtis points: implementation and error estimates. Numer. Math. (1980).

Harald Steck. 2019. Embarrassingly shallow autoencoders for sparse data. In Proc. of WWW.

Fei Sun, Jun Liu, Jian Wu, Changhua Pei, Xiao Lin, Wenwu Ou, and Peng Jiang. 2019. BERT4Rec: Sequential recommendation with bidirectional encoder representations from transformer. In Proc. of CIKM.

Lloyd N Trefethen. 2008. Is Gauss quadrature better than Clenshaw–Curtis? SIAM Review (2008).

Chenyang Wang, Yuanqing Yu, Weizhi Ma, Min Zhang, Chong Chen, Yiqun Liu, and Shaoping Ma. 2022. Towards representation alignment and uniformity in collaborative filtering. In Proc. of KDD.

Chao Wang, Hengshu Zhu, Chen Zhu, Chuan Qin, and Hui Xiong. 2020b. SetRank: A setwise Bayesian approach for collaborative ranking from implicit feedback. In Proc. of AAAI.

Jun Wang, Arjen P De Vries, and Marcel JT Reinders. 2006. Unifying user-based and item-based collaborative filtering approaches by similarity fusion. In Proc. of SIGIR.

Jianling Wang, Kaize Ding, Liangjie Hong, Huan Liu, and James Caverlee. 2020a. Next-item recommendation with sequential hypergraphs. In Proc. of SIGIR.

Xiang Wang, Xiangnan He, Meng Wang, Fuli Feng, and Tat-Seng Chua. 2019. Neural graph collaborative filtering. In Proc. of SIGIR.

Yuhao Wang, Ha Tsz Lam, Yi Wong, Ziru Liu, Xiangyu Zhao, Yichao Wang, Bo Chen, Huifeng Guo, and Ruiming Tang. 2023. Multi-Task Deep Recommender Systems: A Survey. CoRR (2023).

Felix Wu, Amauri Souza, Tianyi Zhang, Christopher Fifty, Tao Yu, and Kilian Weinberger. 2019. Simplifying graph convolutional networks. In Proc. of ICML.

Jiancan Wu, Xiang Wang, Fuli Feng, Xiangnan He, Liang Chen, Jianxun Lian, and Xing Xie. 2021. Self-supervised graph learning for recommendation. In Proc. of SIGIR.

Likang Wu, Zhaopeng Qiu, Zhi Zheng, Hengshu Zhu, and Enhong Chen. 2024. Exploring large language model for graph data understanding in online job recommendations. In Proc. of AAAI.

Yao Wu, Christopher DuBois, Alice X Zheng, and Martin Ester. 2016. Collaborative denoising auto-encoders for top-n recommender systems. In Proc. of WSDM.

Xin Xia, Hongzhi Yin, Junliang Yu, Qinyong Wang, Lizhen Cui, and Xiangliang Zhang. 2021. Self-supervised hypergraph convolutional networks for session-based recommendation. In Proc. of AAAI.

Xu Xie, Fei Sun, Zhaoyang Liu, Shiwen Wu, Jinyang Gao, Jiandong Zhang, Bolin Ding, and Bin Cui. 2022. Contrastive learning for sequential recommendation. In Proc. of ICDE.

Tanya V Yadalam, Vaishnavi M Gowda, Vanditha Shiva Kumar, Disha Girish, and M Namratha. 2020. Career recommendation systems using content based filtering. In Proc. of ICCES.

Shuo Yang, Mohammed Korayem, Khalifeh AlJadda, Trey Grainger, and Sriraam Natarajan. 2017. Combining content-based and collaborative filtering for job recommendation system: A cost-sensitive Statistical Relational Learning approach. Knowledge-Based Systems (2017).

Yuhao Yang, Chao Huang, Lianghao Xia, and Chenliang Li. 2022. Knowledge graph contrastive learning for recommendation. In Proc. of SIGIR.

Tiansheng Yao, Xinyang Yi, Derek Zhiyuan Cheng, Felix Yu, Ting Chen, Aditya Menon, Lichan Hong, Ed H Chi, Steve Tjoa, Jieqi Kang, et al. 2021. Self-supervised learning for large-scale item recommendations. In Proc. of CIKM.

Chi Zhang, Rui Chen, Xiangyu Zhao, Qilong Han, and Li Li. 2023. Denoising and Prompt-Tuning for Multi-Behavior Recommendation. In Proc. of WWW.

Mengqi Zhang, Shu Wu, Meng Gao, Xin Jiang, Ke Xu, and Liang Wang. 2020. Personalized graph neural networks with attention mechanism for session-aware recommendation. IEEE Transactions on Knowledge and Data Engineering (2020).

Xiangyu Zhao, Tong Xu, Qi Liu, and Hao Guo. 2016. Exploring the choice under conflict for social event participation. In Proc. of DASFAA.

Xiangyu Zhao, Liang Zhang, Zhuoye Ding, Long Xia, Jiliang Tang, and Dawei Yin. 2018. Recommendations with negative feedback via pairwise deep reinforcement learning. In Proc. of KDD.

Jiawei Zheng, Qianli Ma, Hao Gu, and Zhenjing Zheng. 2021. Multi-view denoising graph auto-encoders on heterogeneous information networks for cold-start recommendation.
In Proc. of KDD.

Zhi Zheng, Xiao Hu, Zhaopeng Qiu, Yuan Cheng, Shanshan Gao, Yang Song, Hengshu Zhu, and Hui Xiong. 2024. Bilateral Multi-Behavior Modeling for Reciprocal Recommendation in Online Recruitment. IEEE Transactions on Knowledge and Data Engineering (2024).

Zhi Zheng, Zhaopeng Qiu, Tong Xu, Xian Wu, Xiangyu Zhao, Enhong Chen, and Hui Xiong. 2022. CBR: Context bias aware recommendation for debiasing user modeling and click prediction. In Proc. of WWW.

Appendix A Notations

We summarize all notations used in this paper and list them in Table 6.

Table 6. Notations in this paper. [Two-column table (Notation, Description) loaded from data/notation.txt.]

Appendix B Model Complexity

Our proposed framework, BISTRO, is efficient.
We analyze its efficiency from two aspects: the theoretical level (the reduction in computational complexity) and the application level (the per-sample online response time).

Theoretical level

Among all modules in BISTRO, the graph neural network (GNN) is the most time-consuming; graph learning-based recommender systems in general face computational limitations in practice. To address this issue, we cluster users (jobs) based on the semantic information in their resumes (job requirements) using the K-Means algorithm, and we use a simple RNN to extract the personalized preference of each person within a user group. Extracting preference features at the level of user (job) groups reduces the computational overhead of the GNN. Since eigendecomposition in spectral GNNs is resource-intensive, we replace it with a Chebyshev polynomial approximation, whose coefficients can be computed with the Fast Fourier Transform; this reduces the computational complexity from exponential to a much lower order. Therefore, our algorithm is efficient.

Application level

In practice, the training of all models is performed offline. For example, we use a Spark cluster to compute the clustering center of each group and use the HGNN to learn the corresponding representation of each group. New or updated users and jobs are assigned to the nearest group based on semantic clustering. Only the RNN module operates online, inferring personalized user representations within groups. In the online experiment, the 99% response time of BISTRO meets the per-sample latency requirement of online serving.
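To make the Chebyshev argument concrete, the sketch below is a minimal illustration (not the authors' implementation, and on an ordinary normalized graph Laplacian rather than BISTRO's hypergraph Laplacian): a wavelet-style low-pass kernel is approximated by a truncated Chebyshev series whose coefficients come from cosine sums at Chebyshev points (an FFT/DCT evaluates the same sums faster at high orders), and the filter is applied with the three-term recurrence, so no eigendecomposition is needed.

```python
import numpy as np

def cheb_coeffs(kernel, order, lam_max=2.0):
    # Chebyshev coefficients of kernel(lambda) on [0, lam_max], from samples
    # at Chebyshev points; a direct cosine sum here, which an FFT/DCT can
    # replace for large orders.
    n = order + 1
    theta = np.pi * (np.arange(n) + 0.5) / n
    lam = lam_max * (np.cos(theta) + 1.0) / 2.0          # map [-1, 1] -> [0, lam_max]
    samples = kernel(lam)
    coeffs = np.array([(2.0 / n) * np.sum(samples * np.cos(k * theta))
                       for k in range(n)])
    coeffs[0] /= 2.0
    return coeffs

def cheb_filter(laplacian, x, coeffs, lam_max=2.0):
    # Apply sum_k c_k * T_k(L_scaled) @ x via the three-term recurrence;
    # only (sparse) matrix-vector products, no spectral decomposition.
    n = laplacian.shape[0]
    L_scaled = (2.0 / lam_max) * laplacian - np.eye(n)
    t_prev, t_curr = x, L_scaled @ x
    out = coeffs[0] * t_prev + coeffs[1] * t_curr
    for c in coeffs[2:]:
        t_next = 2.0 * (L_scaled @ t_curr) - t_prev
        out = out + c * t_next
        t_prev, t_curr = t_curr, t_next
    return out

# Toy usage: 4-node path graph, normalized Laplacian, heat-like kernel exp(-2*lambda).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
L = np.eye(4) - A / np.sqrt(np.outer(d, d))              # eigenvalues lie in [0, 2]
x = np.random.default_rng(0).normal(size=(4, 3))          # 3-dim node features
y = cheb_filter(L, x, cheb_coeffs(lambda lam: np.exp(-2.0 * lam), order=10))
```

The same pattern extends to a hypergraph Laplacian; the only point here is that the filter response is evaluated through a fixed number of sparse products instead of an explicit spectrum.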
Appendix C Experiment Detail

C.1. Baselines Detail

The details of these baselines are as follows:

- BasicMF (Koren et al., 2009): A model that combines matrix factorization with a Multilayer Perceptron (MLP) for recommendations.
- ItemKNN (Wang et al., 2006): A recommender that utilizes item-based collaborative filtering.
- PureSVD (Cremonesi et al., 2010): An approach that applies Singular Value Decomposition for recommendation tasks.
- SLIM (Ning and Karypis, 2011): A recommendation method known as the Sparse Linear Method.
- DAE (Wu et al., 2016): The Collaborative Denoising Auto-Encoder, used in recommender systems.
- MultVAE (Liang et al., 2018): A model extending Variational Autoencoders to collaborative filtering for implicit feedback.
- EASE (Steck, 2019): A recommendation technique called Embarrassingly Shallow Autoencoders for Sparse Data.
class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p9\">\n<p class=\"ltx_p\" id=\"A3.SS1.p9.1\"> P3a <cite class=\"ltx_cite ltx_citemacro_citep\">(Cooper et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib7\" title=\"\">2014 ###reference_b7### ###reference_b7### ###reference_b7### ###reference_b7### ###reference_b7###</a>)</cite>: A method that uses ordering rules from random walks on a user-item graph.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p10\">\n<p class=\"ltx_p\" id=\"A3.SS1.p10.1\"> RP3b <cite class=\"ltx_cite ltx_citemacro_citep\">(Paudel et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib37\" title=\"\">2016 ###reference_b37### ###reference_b37### ###reference_b37### ###reference_b37### ###reference_b37###</a>)</cite>: A recommender that re-ranks items based on 3-hop random walk transition probabilities.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p11\">\n<p class=\"ltx_p\" id=\"A3.SS1.p11.1\"> NGCF <cite class=\"ltx_cite ltx_citemacro_citep\">(Wang et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib49\" title=\"\">2019 ###reference_b49### ###reference_b49### ###reference_b49### ###reference_b49### ###reference_b49###</a>)</cite>: Employs graph embedding propagation layers to generate user/item representations.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p12\">\n<p class=\"ltx_p\" id=\"A3.SS1.p12.1\"> LightGCN <cite class=\"ltx_cite ltx_citemacro_citep\">(He et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib11\" title=\"\">2020 ###reference_b11### ###reference_b11### ###reference_b11### ###reference_b11### ###reference_b11###</a>)</cite>: Utilizes neighborhood information in the user-item interaction graph.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p13\">\n<p class=\"ltx_p\" id=\"A3.SS1.p13.1\"> SLRec <cite class=\"ltx_cite ltx_citemacro_citep\">(Yao et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib60\" title=\"\">2021 ###reference_b60### ###reference_b60### ###reference_b60### ###reference_b60### ###reference_b60###</a>)</cite>: A method using contrastive learning among node features.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p14\">\n<p class=\"ltx_p\" id=\"A3.SS1.p14.1\"> SGL <cite class=\"ltx_cite ltx_citemacro_citep\">(Wu et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib52\" title=\"\">2021 ###reference_b52### ###reference_b52### ###reference_b52### ###reference_b52### ###reference_b52###</a>)</cite>: Enhances LightGCN with self-supervised contrastive learning.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p15\">\n<p class=\"ltx_p\" id=\"A3.SS1.p15.1\"> GCCF <cite class=\"ltx_cite ltx_citemacro_citep\">(Chen et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib6\" title=\"\">2020 ###reference_b6### ###reference_b6### ###reference_b6### ###reference_b6### ###reference_b6###</a>)</cite>: A multi-layer graph convolutional network for recommendation.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p16\">\n<p class=\"ltx_p\" id=\"A3.SS1.p16.1\"> NCL <cite class=\"ltx_cite ltx_citemacro_citep\">(Lin et al<span class=\"ltx_text\">.</span>, <a 
class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib29\" title=\"\">2022 ###reference_b29### ###reference_b29### ###reference_b29### ###reference_b29### ###reference_b29###</a>)</cite>: Enhances recommendation models with neighborhood-enriched contrastive learning.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p17\">\n<p class=\"ltx_p\" id=\"A3.SS1.p17.1\"> DirectAU <cite class=\"ltx_cite ltx_citemacro_citep\">(Wang et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib45\" title=\"\">2022 ###reference_b45### ###reference_b45### ###reference_b45### ###reference_b45### ###reference_b45###</a>)</cite>: Focuses on the quality of representation based on alignment and uniformity.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p18\">\n<p class=\"ltx_p\" id=\"A3.SS1.p18.1\"> HG-GNN <cite class=\"ltx_cite ltx_citemacro_citep\">(Pang et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib36\" title=\"\">2022 ###reference_b36### ###reference_b36### ###reference_b36### ###reference_b36### ###reference_b36###</a>)</cite>: Constructs a heterogeneous graph with both user nodes and item nodes and uses a graph neural network to learn the embedding of nodes as a potential representation of users or items.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p19\">\n<p class=\"ltx_p\" id=\"A3.SS1.p19.1\"> A-PGNN <cite class=\"ltx_cite ltx_citemacro_citep\">(Zhang et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib62\" title=\"\">2020 ###reference_b62### ###reference_b62### ###reference_b62### ###reference_b62### ###reference_b62###</a>)</cite>: Uses GNN to extract session representations for intra-session interactions and uses an attention mechanism to learn features between sessions.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p20\">\n<p class=\"ltx_p\" id=\"A3.SS1.p20.1\"> AdaGCL <cite class=\"ltx_cite ltx_citemacro_citep\">(Jiang et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib19\" title=\"\">2023 ###reference_b19### ###reference_b19### ###reference_b19### ###reference_b19### ###reference_b19###</a>)</cite>: Combines a graph generator and a graph denoising model for contrastive views.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p21\">\n<p class=\"ltx_p\" id=\"A3.SS1.p21.1\"> MvDGAE <cite class=\"ltx_cite ltx_citemacro_citep\">(Zheng et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib65\" title=\"\">2021 ###reference_b65### ###reference_b65### ###reference_b65### ###reference_b65### ###reference_b65###</a>)</cite>: Stands for Multi-view Denoising Graph AutoEncoders.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p22\">\n<p class=\"ltx_p\" id=\"A3.SS1.p22.1\"> STAMP <cite class=\"ltx_cite ltx_citemacro_citep\">(Liu et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib32\" title=\"\">2018 ###reference_b32### ###reference_b32### ###reference_b32### ###reference_b32### ###reference_b32###</a>)</cite>: A model based on the attention mechanism to model user behavior sequence data.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p23\">\n<p class=\"ltx_p\" id=\"A3.SS1.p23.1\"> GRU4Rec <cite class=\"ltx_cite ltx_citemacro_citep\">(Hidasi et al<span 
class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib12\" title=\"\">2016 ###reference_b12### ###reference_b12### ###reference_b12### ###reference_b12### ###reference_b12###</a>)</cite>: Utilizes Gated Recurrent Units for session-based recommendations.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p24\">\n<p class=\"ltx_p\" id=\"A3.SS1.p24.1\"> BERT4Rec <cite class=\"ltx_cite ltx_citemacro_citep\">(Sun et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib43\" title=\"\">2019 ###reference_b43### ###reference_b43### ###reference_b43### ###reference_b43### ###reference_b43###</a>)</cite>: A model for the sequence-based recommendation that handles long user behavior sequences.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p25\">\n<p class=\"ltx_p\" id=\"A3.SS1.p25.1\"> CL4Rec <cite class=\"ltx_cite ltx_citemacro_citep\">(Xie et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib56\" title=\"\">2022 ###reference_b56### ###reference_b56### ###reference_b56### ###reference_b56### ###reference_b56###</a>)</cite>: An improved version of BERT4Rec with locality-sensitive hashing for faster item retrieval.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p26\">\n<p class=\"ltx_p\" id=\"A3.SS1.p26.1\"> CoScRec <cite class=\"ltx_cite ltx_citemacro_citep\">(Liu et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib33\" title=\"\">2021 ###reference_b33### ###reference_b33### ###reference_b33### ###reference_b33### ###reference_b33###</a>)</cite>: It explores an innovative recommendation approach that enhances sequential recommendation systems through robust data augmentation and contrastive self-supervised learning techniques.</p>\n</div>\n<div class=\"ltx_para ltx_noindent\" id=\"A3.SS1.p27\">\n<p class=\"ltx_p\" id=\"A3.SS1.p27.1\"> TiCoSeRec <cite class=\"ltx_cite ltx_citemacro_citep\">(Dang et al<span class=\"ltx_text\">.</span>, <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#bib.bib10\" title=\"\">2023 ###reference_b10### ###reference_b10### ###reference_b10### ###reference_b10### ###reference_b10###</a>)</cite>: A method based on CoSeRec, utilizing data augmentation algorithms for sequence recommendation improvement.</p>\n</div>\n</section>\n<section class=\"ltx_subsection\" id=\"A3.SS2\">\n<h3 class=\"ltx_title ltx_title_subsection\">\n<span class=\"ltx_tag ltx_tag_subsection\">C.2. </span>The number of recommended jobs</h3>\n<div class=\"ltx_para\" id=\"A3.SS2.p1\">\n<p class=\"ltx_p\" id=\"A3.SS2.p1.3\">The hyperparameter, , also has a critical impact on experimental results.\nWe set and to conduct experiments respectively.\nThe experimental results are shown in Table <a class=\"ltx_ref\" href=\"https://arxiv.org/html/2407.00082v1#A3.SS2\" title=\"C.2. The number of recommended jobs ‣ Appendix C Experiment Detail ‣ Appendix B Model Complexity ‣ Appendix A Notations ‣ 6. Conclusion ‣ 5.5. Parametric Study (RQ4) ‣ 5.4. Ablation Study (RQ2, RQ3) ‣ 5.3. Overall Performance (RQ1) ‣ 5.2. Experimental Settings ‣ 5.1. Dataset ‣ 5. 
It can be seen that our model performs well under all settings.

Table 7. Results under different settings of the number of recommended jobs. [Table loaded from data/exp_para_k.txt.]
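Assuming the tables report hit ratio (H@K) and mean reciprocal rank (M@K) at the chosen cutoff, as is standard for top-K job recommendation, the snippet below is a minimal, illustrative sketch of how the per-user values are typically computed; the function names and toy scores are ours, not the paper's evaluation code.

```python
def hit_at_k(ranked_items, ground_truth, k=10):
    """HR@K: 1 if any ground-truth job appears in the top-K recommendations."""
    return float(any(item in ground_truth for item in ranked_items[:k]))

def mrr_at_k(ranked_items, ground_truth, k=10):
    """MRR@K: reciprocal rank of the first ground-truth job within the top-K."""
    for rank, item in enumerate(ranked_items[:k], start=1):
        if item in ground_truth:
            return 1.0 / rank
    return 0.0

# Toy example: one user, model scores over candidate job ids.
scores = {"job_a": 0.31, "job_b": 0.87, "job_c": 0.55, "job_d": 0.12}
ranked = sorted(scores, key=scores.get, reverse=True)   # ["job_b", "job_c", ...]
truth = {"job_c"}
print(hit_at_k(ranked, truth, k=10), mrr_at_k(ranked, truth, k=10))  # 1.0 0.5
```

Dataset-level H@K and M@K are then the averages of these per-user values over all test users.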
C.3. Case Study

[Figure 5: Case-study diagram. It shows user nodes (UID ***479, ***175, ***013) with profile cards (Age: aa; Skill: bb, cc or ee, ff; Degree: dd), job nodes (JID ***265, ***872, ***994, ***523) with requirement cards (Skill Req: bb+cc or ee+ff; Degree Req: dd; Experience: hh; Location: gg), and interaction edges labeled Click, Expect, Rec., Revise Resume, and New Jobs.]
[Figure 5. A real-life scenario for the job recommendation.]
Beyond its effectiveness in performance, BISTRO also boasts considerable interpretability. To demonstrate how the framework mitigates both the job preference drift and data noise problems, we present a real-life scenario that illustrates the logic behind the suggestions made by BISTRO, as shown in Figure 5.
In this figure, the jobs with IDs ***872 and ***994 are two positions newly posted in the online recruitment system, while IDs ***265 and ***523 are two positions that a large number of users interact with frequently. Among them, ***872 and ***265, as well as ***994 and ***523, have similar occupational demand descriptions, respectively. In addition, the user with ID ***175 shared a similar resume with user ***479 before ***175 modified the resume; after the resume was changed, ***175 had content similar to that of user ***013. Recommendations in this scenario can be divided into three examples:
Example 1 (Recommendation for a dynamically changing user). Consider the user represented by ID ***175. BISTRO addresses this challenge by deploying content-based analysis. The framework utilizes the user's social network and the collected resume attributes to create a composite feature profile that identifies users with similar tastes. Subsequently, it recommends the job with ID ***523, favored by the like-minded user ***013, to this user.
Example 2 (Recommendation for a new job). A newly posted job with ID ***872 lacks any user interaction data, which complicates the generation of a meaningful representation for it. BISTRO overcomes this by incorporating auxiliary information such as skill requirements and working experience, and then uses the associated tags to locate similar content. By combining this approach with the user's expectations, BISTRO acquires a rich and informative embedding for the job, enabling it to recommend the job to users who have shown interest in comparable positions.
Example 3 (Recommend a new job to a dynamically changing user). Combining the two situations illustrated above, BISTRO handles this complex challenge by utilizing a wavelet graph denoising filter and a graph representation method. In this way, it can recommend the latest jobs with similar job content to users who have the same real-time needs as well as similar content characteristics.
Table 8. Results of different job recommender systems. [ tabular=ccccccc, table head= Shenzhen Shanghai Beijing H@10 M@10 H@10 M@10 H@10 M@10 , late after line= , late after last line= ]data/app_case.txt\\csvlinetotablerow Bold indicates statistically significant improvements (i.e., two-sided t-test with p < 0.05) over the best baseline (underlined). For all metrics: the higher, the better.
In addition, we also compare the proposed framework, BISTRO, with multiple state-of-the-art job recommender systems, i.e., InEXIT (Shao et al., 2023), DGMN (Bian et al., 2019), and APJFMF (Jian et al., 2024). The results are reported in Table 8. Our framework achieves the best performance among all baselines, which verifies the effectiveness of our method.",
"perturb_sentence_id": [ | |
7, | |
8 | |
], | |
"output": { | |
"perturbed_statement": "[paragraph id = 7] [ tabular=cccc, table head= Density H@5 M@10 , late after line= , late after last line= ]data/exp_hyperedge.txt\\csvlinetotablerow Non-Noise Add 10% Noise Add 20% Noise Add 50% Noise The validity of hypergraph wavelet filter In BISTRO, we design a novel hypergraph wavelet learning method.In this learning method, a wavelet filter is deployed for data denoising as well as fine-grained job preference feature extraction.", | |
"perturbed_explanation": "The comparison results are shown in Table 5.4 and the experimental results are improved through optimizing the graph structure. As shown in Table 5.4, the graph density doubles with the addition of all hyperedges, which improves the experimental results. In the statement, replacing H@10 with H@5 introduces an inconsistency because there's no mention of H@5 in the context, and the change in density doubling as described would correspond with H@10, not H@5." | |
}
}
] |