
attr_loss is a negative value #53

Open
feifeifei12 opened this issue Sep 12, 2023 · 0 comments
Thank you for your work on GPT-GNN. I am very interested in this paper and am currently reproducing your code. When I run your file "example_reddit/pretrain_reddit.py", train_loss consists of two parts, link_loss and attr_loss. However, attr_loss is negative, and during training it keeps getting smaller (more negative), as shown below.

Epoch: 1, (1 / 19) 31.7s LR: 0.00088 Train Loss: (5.198, -1.370) Valid Loss: (4.632, -2.647) NDCG: 0.331 Norm: 3.278 queue: 85
UPDATE!!!
Data Preparation: 2.2s
Epoch: 1, (2 / 19) 22.3s LR: 0.00099 Train Loss: (4.473, -2.519) Valid Loss: (4.257, -3.451) NDCG: 0.398 Norm: 3.231 queue: 85
UPDATE!!!
Data Preparation: 2.4s
Epoch: 1, (3 / 19) 23.9s LR: 0.00097 Train Loss: (4.220, -2.910) Valid Loss: (4.169, -3.735) NDCG: 0.421 Norm: 4.161 queue: 85
UPDATE!!!
Data Preparation: 2.4s
Epoch: 1, (4 / 19) 25.1s LR: 0.00095 Train Loss: (4.041, -3.255) Valid Loss: (3.664, -4.088) NDCG: 0.529 Norm: 5.624 queue: 85
UPDATE!!!
Data Preparation: 2.5s
Epoch: 1, (5 / 19) 23.1s LR: 0.00093 Train Loss: (3.916, -3.580) Valid Loss: (3.703, -4.224) NDCG: 0.489 Norm: 5.439 queue: 85
UPDATE!!!
Data Preparation: 2.3s
Epoch: 1, (6 / 19) 21.1s LR: 0.00092 Train Loss: (3.839, -3.743) Valid Loss: (3.718, -4.373) NDCG: 0.499 Norm: 7.101 queue: 85
UPDATE!!!
Data Preparation: 5.7s
Epoch: 1, (7 / 19) 21.5s LR: 0.00090 Train Loss: (3.795, -3.839) Valid Loss: (3.806, -4.290) NDCG: 0.481 Norm: 6.792 queue: 85
Data Preparation: 2.9s
Epoch: 1, (8 / 19) 21.5s LR: 0.00088 Train Loss: (3.764, -3.905) Valid Loss: (3.626, -4.551) NDCG: 0.517 Norm: 7.914 queue: 85
UPDATE!!!
Data Preparation: 3.4s
Epoch: 1, (9 / 19) 21.4s LR: 0.00086 Train Loss: (3.790, -4.006) Valid Loss: (3.527, -4.650) NDCG: 0.518 Norm: 8.545 queue: 85
UPDATE!!!
Data Preparation: 4.9s
Epoch: 1, (10 / 19) 21.5s LR: 0.00085 Train Loss: (3.789, -4.053) Valid Loss: (3.567, -4.608) NDCG: 0.519 Norm: 7.481 queue: 85
Data Preparation: 4.1s
Epoch: 1, (11 / 19) 20.7s LR: 0.00083 Train Loss: (3.667, -4.089) Valid Loss: (3.434, -4.530) NDCG: 0.543 Norm: 8.159 queue: 85
Data Preparation: 4.8s
Epoch: 1, (12 / 19) 21.8s LR: 0.00081 Train Loss: (3.719, -4.171) Valid Loss: (3.316, -4.752) NDCG: 0.564 Norm: 8.720 queue: 85
UPDATE!!!
Data Preparation: 2.8s
Epoch: 1, (13 / 19) 20.7s LR: 0.00079 Train Loss: (3.656, -4.219) Valid Loss: (3.577, -4.913) NDCG: 0.521 Norm: 8.388 queue: 85
Data Preparation: 6.1s
Epoch: 1, (14 / 19) 25.3s LR: 0.00078 Train Loss: (3.630, -4.223) Valid Loss: (3.508, -4.865) NDCG: 0.527 Norm: 8.830 queue: 85
Data Preparation: 13.7s
Epoch: 1, (15 / 19) 22.7s LR: 0.00076 Train Loss: (3.666, -4.272) Valid Loss: (3.507, -4.825) NDCG: 0.530 Norm: 9.109 queue: 85
Data Preparation: 2.5s
Epoch: 1, (16 / 19) 22.2s LR: 0.00074 Train Loss: (3.648, -4.333) Valid Loss: (3.511, -5.162) NDCG: 0.543 Norm: 8.926 queue: 85
UPDATE!!!
Data Preparation: 2.9s
Epoch: 1, (17 / 19) 22.2s LR: 0.00073 Train Loss: (3.608, -4.351) Valid Loss: (3.555, -4.808) NDCG: 0.515 Norm: 9.351 queue: 85
Data Preparation: 2.4s
Epoch: 1, (18 / 19) 22.1s LR: 0.00071 Train Loss: (3.621, -4.367) Valid Loss: (3.386, -5.186) NDCG: 0.555 Norm: 9.608 queue: 85
UPDATE!!!

Apart from reducing sample_depth and sample_width to fit into memory, I have not made any other modifications to your code. May I ask whether this behavior of attr_loss is normal? Looking forward to your answer.
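For context, my own guess (an assumption on my part, not taken from the repository's code) is that a loss defined as a negative similarity or negative log-likelihood can legitimately take negative values and keep decreasing below zero as training improves. A minimal toy sketch of such a loss:

```python
import torch
import torch.nn.functional as F

# Hypothetical attribute loss: negative mean cosine similarity between
# predicted and ground-truth node features. This is only an illustration
# of why a loss can be negative, not necessarily what pretrain_reddit.py
# computes. The value lies in [-1, 1] and moves toward -1 as the
# predictions align with the targets.
def toy_attr_loss(pred_feats, true_feats):
    return -F.cosine_similarity(pred_feats, true_feats, dim=-1).mean()

pred = torch.randn(4, 602)   # 602 = Reddit feature dimension, illustrative only
true = torch.randn(4, 602)
print(toy_attr_loss(pred, true))          # close to 0 for random vectors
print(toy_attr_loss(true.clone(), true))  # -1.0 when predictions match exactly
```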
