debug.log
2024-12-09 23:14:53,393 [DEBUG] Starting new HTTPS connection (1): pypi.org:443
2024-12-09 23:14:53,455 [DEBUG] https://pypi.org:443 "GET /pypi/praw/json HTTP/11" 200 36290
2024-12-09 23:14:53,465 [INFO] Successfully initialized Reddit API.
2024-12-09 23:14:53,466 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733804093.4663098
2024-12-09 23:14:53,466 [DEBUG] Data: None
2024-12-09 23:14:53,466 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-09 23:14:53,468 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-09 23:14:53,570 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 656
2024-12-09 23:14:53,573 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-09 23:14:54,305 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 52045
2024-12-09 23:14:54,323 [DEBUG] Response: 200 (52045 bytes) (rst-306:rem-999.0:used-1 ratelimit) at 1733804094.3233569
2024-12-09 23:14:54,335 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hae4tc/ at 1733804094.335097
2024-12-09 23:14:54,335 [DEBUG] Data: None
2024-12-09 23:14:54,335 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:14:54,468 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hae4tc/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3187
2024-12-09 23:14:54,469 [DEBUG] Response: 200 (3187 bytes) (rst-305:rem-998.0:used-2 ratelimit) at 1733804094.4695559
2024-12-09 23:14:54,471 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9vjwq/ at 1733804094.471811
2024-12-09 23:14:54,471 [DEBUG] Data: None
2024-12-09 23:14:54,472 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:14:54,602 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9vjwq/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3408
2024-12-09 23:14:54,605 [DEBUG] Response: 200 (3408 bytes) (rst-305:rem-997.0:used-3 ratelimit) at 1733804094.605022
2024-12-09 23:14:54,607 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9sr7i/ at 1733804094.607161
2024-12-09 23:14:54,607 [DEBUG] Data: None
2024-12-09 23:14:54,607 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:14:54,757 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9sr7i/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3899
2024-12-09 23:14:54,758 [DEBUG] Response: 200 (3899 bytes) (rst-305:rem-996.0:used-4 ratelimit) at 1733804094.75892
2024-12-09 23:14:54,760 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9omov/ at 1733804094.76074
2024-12-09 23:14:54,760 [DEBUG] Data: None
2024-12-09 23:14:54,760 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:14:55,120 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9omov/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4230
2024-12-09 23:14:55,121 [DEBUG] Response: 200 (4230 bytes) (rst-305:rem-995.0:used-5 ratelimit) at 1733804095.121963
2024-12-09 23:14:55,123 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9fg8c/ at 1733804095.1236842
2024-12-09 23:14:55,123 [DEBUG] Data: None
2024-12-09 23:14:55,123 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:14:55,275 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9fg8c/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 6631
2024-12-09 23:14:55,277 [DEBUG] Response: 200 (6631 bytes) (rst-304:rem-994.0:used-6 ratelimit) at 1733804095.277109
2024-12-09 23:14:55,279 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h93w33/ at 1733804095.279425
2024-12-09 23:14:55,279 [DEBUG] Data: None
2024-12-09 23:14:55,279 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:14:55,471 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h93w33/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 5247
2024-12-09 23:14:55,473 [DEBUG] Response: 200 (5247 bytes) (rst-304:rem-993.0:used-7 ratelimit) at 1733804095.4729862
2024-12-09 23:14:55,475 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8p7ym/ at 1733804095.475589
2024-12-09 23:14:55,475 [DEBUG] Data: None
2024-12-09 23:14:55,475 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:14:55,651 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8p7ym/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 7120
2024-12-09 23:14:55,653 [DEBUG] Response: 200 (7120 bytes) (rst-304:rem-992.0:used-8 ratelimit) at 1733804095.653257
2024-12-09 23:14:55,656 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8lxr0/ at 1733804095.656119
2024-12-09 23:14:55,656 [DEBUG] Data: None
2024-12-09 23:14:55,656 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:14:55,800 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8lxr0/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3397
2024-12-09 23:14:55,801 [DEBUG] Response: 200 (3397 bytes) (rst-304:rem-991.0:used-9 ratelimit) at 1733804095.801724
2024-12-09 23:14:55,802 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8ktl3/ at 1733804095.8029828
2024-12-09 23:14:55,803 [DEBUG] Data: None
2024-12-09 23:14:55,803 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:14:56,144 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8ktl3/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 7672
2024-12-09 23:14:56,146 [DEBUG] Response: 200 (7672 bytes) (rst-304:rem-990.0:used-10 ratelimit) at 1733804096.146434
2024-12-09 23:14:56,150 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733804096.15087
2024-12-09 23:14:56,151 [DEBUG] Data: None
2024-12-09 23:14:56,151 [DEBUG] Params: {'after': 't3_1h3s6y6', 'limit': 200, 'raw_json': 1}
2024-12-09 23:14:56,963 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&after=t3_1h3s6y6&raw_json=1 HTTP/11" 200 51688
2024-12-09 23:14:56,965 [DEBUG] Response: 200 (51688 bytes) (rst-303:rem-989.0:used-11 ratelimit) at 1733804096.9659688
2024-12-09 23:14:56,978 [INFO] Fetched 9 posts after filtering.
2024-12-09 23:14:56,992 [ERROR] Error filtering posts with GPT-3.5.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_good_posts_with_gpt35
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-09 23:14:56,995 [ERROR] Error in run_script
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_good_posts_with_gpt35
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 189, in run_script
good_posts = filter_good_posts_with_gpt35(posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 131, in filter_good_posts_with_gpt35
raise RuntimeError(f"Error filtering posts with GPT-3.5: {e}")
RuntimeError: Error filtering posts with GPT-3.5:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
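For reference, the migration that the error message describes maps openai.ChatCompletion.create onto a client-based call in openai>=1.0.0. A minimal sketch of what filter_good_posts_with_gpt35 could look like under that interface, assuming OPENAI_API_KEY is set in the environment; the prompt text, post fields, and acceptance check below are placeholders, not the script's actual logic:

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def filter_good_posts_with_gpt35(posts):
        good_posts = []
        for post in posts:
            # client.chat.completions.create replaces openai.ChatCompletion.create in openai>=1.0.0
            response = client.chat.completions.create(
                model="gpt-3.5-turbo",
                messages=[
                    {"role": "system", "content": "You review Reddit posts."},  # placeholder prompt
                    {"role": "user", "content": post["title"]},                 # placeholder field
                ],
                temperature=0.7,
            )
            verdict = response.choices[0].message.content
            if "good" in verdict.lower():  # placeholder acceptance check
                good_posts.append(post)
        return good_posts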
2024-12-09 23:18:44,569 [INFO] Successfully initialized Reddit API.
2024-12-09 23:18:44,570 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733804324.570518
2024-12-09 23:18:44,570 [DEBUG] Data: None
2024-12-09 23:18:44,570 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-09 23:18:44,574 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-09 23:18:44,906 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 655
2024-12-09 23:18:44,912 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-09 23:18:45,933 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 52059
2024-12-09 23:18:45,998 [DEBUG] Response: 200 (52059 bytes) (rst-74:rem-988.0:used-12 ratelimit) at 1733804325.998317
2024-12-09 23:18:46,005 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hae4tc/ at 1733804326.005583
2024-12-09 23:18:46,005 [DEBUG] Data: None
2024-12-09 23:18:46,005 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:18:46,248 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hae4tc/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3188
2024-12-09 23:18:46,250 [DEBUG] Response: 200 (3188 bytes) (rst-73:rem-987.0:used-13 ratelimit) at 1733804326.2504792
2024-12-09 23:18:46,252 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9vjwq/ at 1733804326.25244
2024-12-09 23:18:46,252 [DEBUG] Data: None
2024-12-09 23:18:46,252 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:18:46,482 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9vjwq/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3407
2024-12-09 23:18:46,483 [DEBUG] Response: 200 (3407 bytes) (rst-73:rem-986.0:used-14 ratelimit) at 1733804326.4837031
2024-12-09 23:18:46,485 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9omov/ at 1733804326.485423
2024-12-09 23:18:46,485 [DEBUG] Data: None
2024-12-09 23:18:46,485 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:18:46,719 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9omov/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4231
2024-12-09 23:18:46,721 [DEBUG] Response: 200 (4231 bytes) (rst-73:rem-985.0:used-15 ratelimit) at 1733804326.721081
2024-12-09 23:18:46,722 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9fg8c/ at 1733804326.722373
2024-12-09 23:18:46,722 [DEBUG] Data: None
2024-12-09 23:18:46,722 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:18:46,970 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9fg8c/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 6629
2024-12-09 23:18:46,971 [DEBUG] Response: 200 (6629 bytes) (rst-73:rem-984.0:used-16 ratelimit) at 1733804326.971384
2024-12-09 23:18:46,973 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h93w33/ at 1733804326.973284
2024-12-09 23:18:46,973 [DEBUG] Data: None
2024-12-09 23:18:46,973 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:18:47,364 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h93w33/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 5243
2024-12-09 23:18:47,365 [DEBUG] Response: 200 (5243 bytes) (rst-72:rem-983.0:used-17 ratelimit) at 1733804327.365511
2024-12-09 23:18:47,367 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8p7ym/ at 1733804327.367786
2024-12-09 23:18:47,367 [DEBUG] Data: None
2024-12-09 23:18:47,367 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:18:47,624 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8p7ym/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 7120
2024-12-09 23:18:47,626 [DEBUG] Response: 200 (7120 bytes) (rst-72:rem-982.0:used-18 ratelimit) at 1733804327.626119
2024-12-09 23:18:47,628 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8lxr0/ at 1733804327.628983
2024-12-09 23:18:47,629 [DEBUG] Data: None
2024-12-09 23:18:47,629 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:18:47,855 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8lxr0/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3399
2024-12-09 23:18:47,856 [DEBUG] Response: 200 (3399 bytes) (rst-72:rem-981.0:used-19 ratelimit) at 1733804327.856444
2024-12-09 23:18:47,858 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8ktl3/ at 1733804327.8585
2024-12-09 23:18:47,858 [DEBUG] Data: None
2024-12-09 23:18:47,858 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:18:48,285 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8ktl3/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 7665
2024-12-09 23:18:48,285 [DEBUG] Response: 200 (7665 bytes) (rst-71:rem-980.0:used-20 ratelimit) at 1733804328.285377
2024-12-09 23:18:48,286 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733804328.286581
2024-12-09 23:18:48,286 [DEBUG] Data: None
2024-12-09 23:18:48,286 [DEBUG] Params: {'after': 't3_1h3s6y6', 'limit': 200, 'raw_json': 1}
2024-12-09 23:18:49,176 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&after=t3_1h3s6y6&raw_json=1 HTTP/11" 200 51697
2024-12-09 23:18:49,242 [DEBUG] Response: 200 (51697 bytes) (rst-71:rem-979.0:used-21 ratelimit) at 1733804329.2426171
2024-12-09 23:18:49,254 [INFO] Fetched 8 posts after filtering.
2024-12-09 23:18:49,265 [ERROR] Error filtering posts with GPT-3.5.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_good_posts_with_gpt35
response = openai.Chat.create(
^^^^^^^^^^^
AttributeError: module 'openai' has no attribute 'Chat'. Did you mean: 'chat'?
2024-12-09 23:18:49,266 [ERROR] Error in run_script
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_good_posts_with_gpt35
response = openai.Chat.create(
^^^^^^^^^^^
AttributeError: module 'openai' has no attribute 'Chat'. Did you mean: 'chat'?
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 189, in run_script
good_posts = filter_good_posts_with_gpt35(posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 131, in filter_good_posts_with_gpt35
raise RuntimeError(f"Error filtering posts with GPT-3.5: {e}")
RuntimeError: Error filtering posts with GPT-3.5: module 'openai' has no attribute 'Chat'
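As the interpreter's suggestion ("Did you mean: 'chat'?") hints, the chat completions entry point in openai>=1.0.0 is reached through a client instance rather than a module-level openai.Chat class. A minimal sketch with a placeholder prompt:

    from openai import OpenAI

    client = OpenAI()  # there is no openai.Chat; use client.chat.completions instead
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "placeholder prompt"}],
        temperature=0.7,
    )
    print(response.choices[0].message.content)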
2024-12-09 23:40:52,504 [INFO] Successfully initialized Reddit API.
2024-12-09 23:40:52,505 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733805652.50528
2024-12-09 23:40:52,505 [DEBUG] Data: None
2024-12-09 23:40:52,505 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-09 23:40:52,508 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-09 23:40:52,619 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 651
2024-12-09 23:40:52,625 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-09 23:40:53,474 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 52101
2024-12-09 23:40:53,490 [DEBUG] Response: 200 (52101 bytes) (rst-547:rem-999.0:used-1 ratelimit) at 1733805653.490626
2024-12-09 23:40:53,503 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hae4tc/ at 1733805653.503911
2024-12-09 23:40:53,504 [DEBUG] Data: None
2024-12-09 23:40:53,504 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:40:53,651 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hae4tc/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3188
2024-12-09 23:40:53,652 [DEBUG] Response: 200 (3188 bytes) (rst-546:rem-998.0:used-2 ratelimit) at 1733805653.652489
2024-12-09 23:40:53,653 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9vjwq/ at 1733805653.653036
2024-12-09 23:40:53,653 [DEBUG] Data: None
2024-12-09 23:40:53,653 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:40:53,784 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9vjwq/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3408
2024-12-09 23:40:53,785 [DEBUG] Response: 200 (3408 bytes) (rst-546:rem-997.0:used-3 ratelimit) at 1733805653.7859242
2024-12-09 23:40:53,788 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9omov/ at 1733805653.787992
2024-12-09 23:40:53,788 [DEBUG] Data: None
2024-12-09 23:40:53,788 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:40:53,923 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9omov/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4228
2024-12-09 23:40:53,925 [DEBUG] Response: 200 (4228 bytes) (rst-546:rem-996.0:used-4 ratelimit) at 1733805653.9251451
2024-12-09 23:40:53,926 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9fg8c/ at 1733805653.926789
2024-12-09 23:40:53,926 [DEBUG] Data: None
2024-12-09 23:40:53,927 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:40:54,090 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9fg8c/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 6631
2024-12-09 23:40:54,091 [DEBUG] Response: 200 (6631 bytes) (rst-545:rem-995.0:used-5 ratelimit) at 1733805654.091498
2024-12-09 23:40:54,093 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h93w33/ at 1733805654.093651
2024-12-09 23:40:54,093 [DEBUG] Data: None
2024-12-09 23:40:54,093 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:40:54,289 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h93w33/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 5242
2024-12-09 23:40:54,290 [DEBUG] Response: 200 (5242 bytes) (rst-545:rem-994.0:used-6 ratelimit) at 1733805654.290751
2024-12-09 23:40:54,293 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8p7ym/ at 1733805654.293822
2024-12-09 23:40:54,294 [DEBUG] Data: None
2024-12-09 23:40:54,294 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:40:54,475 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8p7ym/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 7119
2024-12-09 23:40:54,477 [DEBUG] Response: 200 (7119 bytes) (rst-545:rem-993.0:used-7 ratelimit) at 1733805654.477377
2024-12-09 23:40:54,480 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8lxr0/ at 1733805654.4808161
2024-12-09 23:40:54,481 [DEBUG] Data: None
2024-12-09 23:40:54,481 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:40:54,615 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8lxr0/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3400
2024-12-09 23:40:54,617 [DEBUG] Response: 200 (3400 bytes) (rst-545:rem-992.0:used-8 ratelimit) at 1733805654.617085
2024-12-09 23:40:54,618 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8ktl3/ at 1733805654.6189592
2024-12-09 23:40:54,619 [DEBUG] Data: None
2024-12-09 23:40:54,619 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:40:54,904 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8ktl3/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 7667
2024-12-09 23:40:54,905 [DEBUG] Response: 200 (7667 bytes) (rst-545:rem-991.0:used-9 ratelimit) at 1733805654.9057899
2024-12-09 23:40:54,909 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733805654.909388
2024-12-09 23:40:54,909 [DEBUG] Data: None
2024-12-09 23:40:54,909 [DEBUG] Params: {'after': 't3_1h3s6y6', 'limit': 200, 'raw_json': 1}
2024-12-09 23:40:55,819 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&after=t3_1h3s6y6&raw_json=1 HTTP/11" 200 51729
2024-12-09 23:40:55,823 [DEBUG] Response: 200 (51729 bytes) (rst-545:rem-990.0:used-10 ratelimit) at 1733805655.823154
2024-12-09 23:40:55,836 [INFO] Fetched 8 posts after filtering.
2024-12-09 23:40:55,850 [ERROR] Error filtering posts with GPT-3.5.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_good_posts_with_gpt35
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-09 23:40:55,852 [ERROR] Error in run_script
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_good_posts_with_gpt35
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 188, in run_script
good_posts = filter_good_posts_with_gpt35(posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 130, in filter_good_posts_with_gpt35
raise RuntimeError(f"Error filtering posts with GPT-3.5: {e}")
RuntimeError: Error filtering posts with GPT-3.5:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-09 23:42:49,840 [INFO] Successfully initialized Reddit API.
2024-12-09 23:42:49,840 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733805769.840686
2024-12-09 23:42:49,840 [DEBUG] Data: None
2024-12-09 23:42:49,841 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-09 23:42:49,844 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-09 23:42:50,147 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 651
2024-12-09 23:42:50,151 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-09 23:42:51,132 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 52079
2024-12-09 23:42:51,200 [DEBUG] Response: 200 (52079 bytes) (rst-429:rem-989.0:used-11 ratelimit) at 1733805771.200772
2024-12-09 23:42:51,213 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hae4tc/ at 1733805771.213311
2024-12-09 23:42:51,213 [DEBUG] Data: None
2024-12-09 23:42:51,213 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:42:51,455 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hae4tc/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3188
2024-12-09 23:42:51,457 [DEBUG] Response: 200 (3188 bytes) (rst-428:rem-988.0:used-12 ratelimit) at 1733805771.45728
2024-12-09 23:42:51,459 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1ha9il4/ at 1733805771.459326
2024-12-09 23:42:51,459 [DEBUG] Data: None
2024-12-09 23:42:51,459 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:42:51,678 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1ha9il4/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 2894
2024-12-09 23:42:51,679 [DEBUG] Response: 200 (2894 bytes) (rst-428:rem-987.0:used-13 ratelimit) at 1733805771.6797922
2024-12-09 23:42:51,681 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9vjwq/ at 1733805771.6813061
2024-12-09 23:42:51,681 [DEBUG] Data: None
2024-12-09 23:42:51,681 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:42:51,909 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9vjwq/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3408
2024-12-09 23:42:51,910 [DEBUG] Response: 200 (3408 bytes) (rst-428:rem-986.0:used-14 ratelimit) at 1733805771.910948
2024-12-09 23:42:51,912 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9v7dz/ at 1733805771.912642
2024-12-09 23:42:51,912 [DEBUG] Data: None
2024-12-09 23:42:51,912 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:42:52,126 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9v7dz/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 2603
2024-12-09 23:42:52,128 [DEBUG] Response: 200 (2603 bytes) (rst-427:rem-985.0:used-15 ratelimit) at 1733805772.12834
2024-12-09 23:42:52,130 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9omov/ at 1733805772.130259
2024-12-09 23:42:52,130 [DEBUG] Data: None
2024-12-09 23:42:52,130 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:42:52,346 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9omov/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4229
2024-12-09 23:42:52,348 [DEBUG] Response: 200 (4229 bytes) (rst-427:rem-984.0:used-16 ratelimit) at 1733805772.3482711
2024-12-09 23:42:52,349 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9fg8c/ at 1733805772.3496761
2024-12-09 23:42:52,349 [DEBUG] Data: None
2024-12-09 23:42:52,349 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:42:52,598 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9fg8c/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 6631
2024-12-09 23:42:52,599 [DEBUG] Response: 200 (6631 bytes) (rst-427:rem-983.0:used-17 ratelimit) at 1733805772.5997388
2024-12-09 23:42:52,602 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h93w33/ at 1733805772.6024299
2024-12-09 23:42:52,602 [DEBUG] Data: None
2024-12-09 23:42:52,602 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:42:52,974 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h93w33/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 5247
2024-12-09 23:42:52,976 [DEBUG] Response: 200 (5247 bytes) (rst-427:rem-982.0:used-18 ratelimit) at 1733805772.976397
2024-12-09 23:42:52,979 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h8p7ym/ at 1733805772.979136
2024-12-09 23:42:52,979 [DEBUG] Data: None
2024-12-09 23:42:52,979 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-09 23:42:53,247 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h8p7ym/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 7120
2024-12-09 23:42:53,249 [DEBUG] Response: 200 (7120 bytes) (rst-426:rem-981.0:used-19 ratelimit) at 1733805773.2495182
2024-12-09 23:42:53,252 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733805773.252785
2024-12-09 23:42:53,252 [DEBUG] Data: None
2024-12-09 23:42:53,253 [DEBUG] Params: {'after': 't3_1h3s6y6', 'limit': 200, 'raw_json': 1}
2024-12-09 23:42:54,303 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&after=t3_1h3s6y6&raw_json=1 HTTP/11" 200 51711
2024-12-09 23:42:54,306 [DEBUG] Response: 200 (51711 bytes) (rst-426:rem-980.0:used-20 ratelimit) at 1733805774.306784
2024-12-09 23:42:54,319 [INFO] Fetched 8 posts after filtering.
2024-12-09 23:42:54,331 [ERROR] Error filtering posts with GPT-3.5.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_good_posts_with_gpt35
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-09 23:42:54,333 [ERROR] Error in run_script
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_good_posts_with_gpt35
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 188, in run_script
good_posts = filter_good_posts_with_gpt35(posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 130, in filter_good_posts_with_gpt35
raise RuntimeError(f"Error filtering posts with GPT-3.5: {e}")
RuntimeError: Error filtering posts with GPT-3.5:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-10 20:55:42,532 [ERROR] An unexpected error occurred in the main GUI loop.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 162, in <module>
root = create_gui()
^^^^^^^^^^
NameError: name 'create_gui' is not defined
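The traceback above shows create_gui() being called before any such function is defined in reddit_crawler.py. A hypothetical minimal tkinter stub, purely illustrative since the script's real GUI layout is not recoverable from this log:

    import tkinter as tk

    def run_script():
        # Placeholder for the crawler's run_script referenced elsewhere in this log.
        pass

    def create_gui():
        # Hypothetical minimal window; the actual layout is unknown.
        root = tk.Tk()
        root.title("Reddit crawler")
        tk.Button(root, text="Run", command=run_script).pack(padx=20, pady=20)
        return root

    if __name__ == "__main__":
        root = create_gui()
        root.mainloop()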
2024-12-10 21:01:05,089 [DEBUG] Starting new HTTPS connection (1): pypi.org:443
2024-12-10 21:01:05,179 [DEBUG] https://pypi.org:443 "GET /pypi/praw/json HTTP/11" 200 36290
2024-12-10 21:01:05,191 [INFO] Successfully connected to the Reddit API.
2024-12-10 21:01:05,192 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733882465.192451
2024-12-10 21:01:05,192 [DEBUG] Data: None
2024-12-10 21:01:05,192 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-10 21:01:05,193 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-10 21:01:05,725 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 656
2024-12-10 21:01:05,727 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-10 21:01:06,950 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 51596
2024-12-10 21:01:07,090 [DEBUG] Response: 200 (51596 bytes) (rst-534:rem-999.0:used-1 ratelimit) at 1733882467.090763
2024-12-10 21:01:07,103 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbej76/ at 1733882467.103761
2024-12-10 21:01:07,103 [DEBUG] Data: None
2024-12-10 21:01:07,104 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:07,352 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbej76/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3745
2024-12-10 21:01:07,354 [DEBUG] Response: 200 (3745 bytes) (rst-532:rem-998.0:used-2 ratelimit) at 1733882467.354164
2024-12-10 21:01:07,356 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbam3w/ at 1733882467.356151
2024-12-10 21:01:07,356 [DEBUG] Data: None
2024-12-10 21:01:07,357 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:07,608 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbam3w/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3487
2024-12-10 21:01:07,609 [DEBUG] Response: 200 (3487 bytes) (rst-532:rem-997.0:used-3 ratelimit) at 1733882467.6095328
2024-12-10 21:01:07,611 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hb3wd9/ at 1733882467.611068
2024-12-10 21:01:07,611 [DEBUG] Data: None
2024-12-10 21:01:07,611 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:07,906 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hb3wd9/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3655
2024-12-10 21:01:07,908 [DEBUG] Response: 200 (3655 bytes) (rst-532:rem-996.0:used-4 ratelimit) at 1733882467.908187
2024-12-10 21:01:07,908 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hazjmj/ at 1733882467.908871
2024-12-10 21:01:07,908 [DEBUG] Data: None
2024-12-10 21:01:07,908 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:08,245 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hazjmj/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3102
2024-12-10 21:01:08,247 [DEBUG] Response: 200 (3102 bytes) (rst-531:rem-995.0:used-5 ratelimit) at 1733882468.247828
2024-12-10 21:01:08,248 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1harj8o/ at 1733882468.248533
2024-12-10 21:01:08,248 [DEBUG] Data: None
2024-12-10 21:01:08,248 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:08,695 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1harj8o/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 8684
2024-12-10 21:01:08,695 [DEBUG] Response: 200 (8684 bytes) (rst-531:rem-994.0:used-6 ratelimit) at 1733882468.695842
2024-12-10 21:01:08,697 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1haooje/ at 1733882468.697721
2024-12-10 21:01:08,697 [DEBUG] Data: None
2024-12-10 21:01:08,697 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:09,205 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1haooje/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 2624
2024-12-10 21:01:09,206 [DEBUG] Response: 200 (2624 bytes) (rst-531:rem-993.0:used-7 ratelimit) at 1733882469.206076
2024-12-10 21:01:09,206 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hae4tc/ at 1733882469.206506
2024-12-10 21:01:09,206 [DEBUG] Data: None
2024-12-10 21:01:09,206 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:09,481 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hae4tc/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4999
2024-12-10 21:01:09,484 [DEBUG] Response: 200 (4999 bytes) (rst-530:rem-992.0:used-8 ratelimit) at 1733882469.484673
2024-12-10 21:01:09,487 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1ha9il4/ at 1733882469.4871528
2024-12-10 21:01:09,487 [DEBUG] Data: None
2024-12-10 21:01:09,487 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:09,923 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1ha9il4/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3799
2024-12-10 21:01:09,924 [DEBUG] Response: 200 (3799 bytes) (rst-530:rem-991.0:used-9 ratelimit) at 1733882469.9247181
2024-12-10 21:01:09,926 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9vjwq/ at 1733882469.926177
2024-12-10 21:01:09,926 [DEBUG] Data: None
2024-12-10 21:01:09,926 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:10,334 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9vjwq/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4200
2024-12-10 21:01:10,335 [DEBUG] Response: 200 (4200 bytes) (rst-529:rem-990.0:used-10 ratelimit) at 1733882470.335777
2024-12-10 21:01:10,337 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9omov/ at 1733882470.3378718
2024-12-10 21:01:10,337 [DEBUG] Data: None
2024-12-10 21:01:10,337 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:10,744 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9omov/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4231
2024-12-10 21:01:10,746 [DEBUG] Response: 200 (4231 bytes) (rst-529:rem-989.0:used-11 ratelimit) at 1733882470.7460191
2024-12-10 21:01:10,747 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9fg8c/ at 1733882470.7472708
2024-12-10 21:01:10,747 [DEBUG] Data: None
2024-12-10 21:01:10,747 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:01:11,154 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9fg8c/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 6812
2024-12-10 21:01:11,154 [DEBUG] Response: 200 (6812 bytes) (rst-529:rem-988.0:used-12 ratelimit) at 1733882471.154832
2024-12-10 21:01:11,155 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733882471.155989
2024-12-10 21:01:11,156 [DEBUG] Data: None
2024-12-10 21:01:11,156 [DEBUG] Params: {'after': 't3_1h55mng', 'limit': 200, 'raw_json': 1}
2024-12-10 21:01:12,383 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&after=t3_1h55mng&raw_json=1 HTTP/11" 200 50949
2024-12-10 21:01:12,708 [DEBUG] Response: 200 (50949 bytes) (rst-528:rem-987.0:used-13 ratelimit) at 1733882472.708129
2024-12-10 21:01:12,720 [INFO] Fetched and filtered 11 posts.
2024-12-10 21:01:12,720 [ERROR] Error filtering posts with GPT-3.5.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_posts_with_gpt
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-10 21:01:12,723 [ERROR] Error in script execution.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_posts_with_gpt
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 226, in run_script
good_posts = filter_posts_with_gpt(posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 129, in filter_posts_with_gpt
raise RuntimeError(f"Error filtering posts with GPT-3.5: {e}")
RuntimeError: Error filtering posts with GPT-3.5:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-10 21:06:55,666 [INFO] Successfully connected to the Reddit API.
2024-12-10 21:06:55,667 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733882815.66705
2024-12-10 21:06:55,667 [DEBUG] Data: None
2024-12-10 21:06:55,667 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-10 21:06:55,670 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-10 21:06:55,848 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 656
2024-12-10 21:06:55,852 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-10 21:06:57,349 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 51601
2024-12-10 21:06:57,577 [DEBUG] Response: 200 (51601 bytes) (rst-184:rem-986.0:used-14 ratelimit) at 1733882817.5776188
2024-12-10 21:06:57,585 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbej76/ at 1733882817.5853581
2024-12-10 21:06:57,585 [DEBUG] Data: None
2024-12-10 21:06:57,585 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:06:57,981 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbej76/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3737
2024-12-10 21:06:57,982 [DEBUG] Response: 200 (3737 bytes) (rst-182:rem-985.0:used-15 ratelimit) at 1733882817.982956
2024-12-10 21:06:57,985 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbam3w/ at 1733882817.985564
2024-12-10 21:06:57,985 [DEBUG] Data: None
2024-12-10 21:06:57,985 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:06:58,389 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbam3w/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3487
2024-12-10 21:06:58,390 [DEBUG] Response: 200 (3487 bytes) (rst-181:rem-984.0:used-16 ratelimit) at 1733882818.390765
2024-12-10 21:06:58,392 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hb3wd9/ at 1733882818.391996
2024-12-10 21:06:58,392 [DEBUG] Data: None
2024-12-10 21:06:58,392 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:06:58,725 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hb3wd9/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3654
2024-12-10 21:06:58,726 [DEBUG] Response: 200 (3654 bytes) (rst-181:rem-983.0:used-17 ratelimit) at 1733882818.726681
2024-12-10 21:06:58,728 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hazjmj/ at 1733882818.728223
2024-12-10 21:06:58,728 [DEBUG] Data: None
2024-12-10 21:06:58,728 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:06:59,106 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hazjmj/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3104
2024-12-10 21:06:59,106 [DEBUG] Response: 200 (3104 bytes) (rst-181:rem-982.0:used-18 ratelimit) at 1733882819.106786
2024-12-10 21:06:59,107 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1harj8o/ at 1733882819.107737
2024-12-10 21:06:59,107 [DEBUG] Data: None
2024-12-10 21:06:59,107 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:06:59,516 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1harj8o/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 8677
2024-12-10 21:06:59,517 [DEBUG] Response: 200 (8677 bytes) (rst-180:rem-981.0:used-19 ratelimit) at 1733882819.517969
2024-12-10 21:06:59,519 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hae4tc/ at 1733882819.519975
2024-12-10 21:06:59,520 [DEBUG] Data: None
2024-12-10 21:06:59,520 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:06:59,927 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hae4tc/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4999
2024-12-10 21:06:59,929 [DEBUG] Response: 200 (4999 bytes) (rst-180:rem-980.0:used-20 ratelimit) at 1733882819.9291031
2024-12-10 21:06:59,931 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1ha9il4/ at 1733882819.931595
2024-12-10 21:06:59,931 [DEBUG] Data: None
2024-12-10 21:06:59,931 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:07:00,439 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1ha9il4/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3798
2024-12-10 21:07:00,441 [DEBUG] Response: 200 (3798 bytes) (rst-180:rem-979.0:used-21 ratelimit) at 1733882820.4411361
2024-12-10 21:07:00,442 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9vjwq/ at 1733882820.442244
2024-12-10 21:07:00,442 [DEBUG] Data: None
2024-12-10 21:07:00,442 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:07:00,848 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9vjwq/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4200
2024-12-10 21:07:00,849 [DEBUG] Response: 200 (4200 bytes) (rst-179:rem-978.0:used-22 ratelimit) at 1733882820.8493729
2024-12-10 21:07:00,852 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9omov/ at 1733882820.8523278
2024-12-10 21:07:00,852 [DEBUG] Data: None
2024-12-10 21:07:00,853 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:07:01,260 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9omov/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4230
2024-12-10 21:07:01,261 [DEBUG] Response: 200 (4230 bytes) (rst-179:rem-977.0:used-23 ratelimit) at 1733882821.261472
2024-12-10 21:07:01,262 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9fg8c/ at 1733882821.262686
2024-12-10 21:07:01,262 [DEBUG] Data: None
2024-12-10 21:07:01,262 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:07:02,488 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9fg8c/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 6811
2024-12-10 21:07:02,489 [DEBUG] Response: 200 (6811 bytes) (rst-178:rem-976.0:used-24 ratelimit) at 1733882822.489868
2024-12-10 21:07:02,492 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733882822.492353
2024-12-10 21:07:02,492 [DEBUG] Data: None
2024-12-10 21:07:02,492 [DEBUG] Params: {'after': 't3_1h55mng', 'limit': 200, 'raw_json': 1}
2024-12-10 21:07:03,407 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&after=t3_1h55mng&raw_json=1 HTTP/11" 200 50933
2024-12-10 21:07:03,571 [DEBUG] Response: 200 (50933 bytes) (rst-177:rem-975.0:used-25 ratelimit) at 1733882823.5717702
2024-12-10 21:07:03,585 [INFO] Fetched and filtered 10 posts.
2024-12-10 21:07:03,594 [ERROR] Error filtering posts with GPT-3.5.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_posts_with_gpt
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-10 21:07:03,596 [ERROR] Error in script execution.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_posts_with_gpt
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 227, in run_script
good_posts = filter_posts_with_gpt(posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 129, in filter_posts_with_gpt
raise RuntimeError(f"Error filtering posts with GPT-3.5: {e}")
RuntimeError: Error filtering posts with GPT-3.5:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-10 21:18:53,429 [INFO] Successfully connected to the Reddit API.
2024-12-10 21:18:53,430 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733883533.4300852
2024-12-10 21:18:53,430 [DEBUG] Data: None
2024-12-10 21:18:53,430 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-10 21:18:53,432 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-10 21:18:53,750 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 652
2024-12-10 21:18:53,753 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-10 21:18:54,984 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 51620
2024-12-10 21:18:55,058 [DEBUG] Response: 200 (51620 bytes) (rst-65:rem-999.0:used-1 ratelimit) at 1733883535.058859
2024-12-10 21:18:55,068 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbej76/ at 1733883535.068109
2024-12-10 21:18:55,068 [DEBUG] Data: None
2024-12-10 21:18:55,068 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:55,336 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbej76/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3737
2024-12-10 21:18:55,337 [DEBUG] Response: 200 (3737 bytes) (rst-64:rem-998.0:used-2 ratelimit) at 1733883535.33725
2024-12-10 21:18:55,339 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbam3w/ at 1733883535.339644
2024-12-10 21:18:55,339 [DEBUG] Data: None
2024-12-10 21:18:55,339 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:55,589 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbam3w/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3487
2024-12-10 21:18:55,592 [DEBUG] Response: 200 (3487 bytes) (rst-64:rem-997.0:used-3 ratelimit) at 1733883535.5920808
2024-12-10 21:18:55,594 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hb3wd9/ at 1733883535.594342
2024-12-10 21:18:55,594 [DEBUG] Data: None
2024-12-10 21:18:55,594 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:55,856 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hb3wd9/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3656
2024-12-10 21:18:55,857 [DEBUG] Response: 200 (3656 bytes) (rst-64:rem-996.0:used-4 ratelimit) at 1733883535.857954
2024-12-10 21:18:55,860 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hazjmj/ at 1733883535.8603241
2024-12-10 21:18:55,860 [DEBUG] Data: None
2024-12-10 21:18:55,860 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:56,098 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hazjmj/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3103
2024-12-10 21:18:56,098 [DEBUG] Response: 200 (3103 bytes) (rst-64:rem-995.0:used-5 ratelimit) at 1733883536.098956
2024-12-10 21:18:56,099 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1harj8o/ at 1733883536.0999339
2024-12-10 21:18:56,100 [DEBUG] Data: None
2024-12-10 21:18:56,100 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:56,515 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1harj8o/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 8680
2024-12-10 21:18:56,516 [DEBUG] Response: 200 (8680 bytes) (rst-63:rem-994.0:used-6 ratelimit) at 1733883536.516513
2024-12-10 21:18:56,519 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1haooje/ at 1733883536.519514
2024-12-10 21:18:56,519 [DEBUG] Data: None
2024-12-10 21:18:56,519 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:56,769 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1haooje/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 2621
2024-12-10 21:18:56,770 [DEBUG] Response: 200 (2621 bytes) (rst-63:rem-993.0:used-7 ratelimit) at 1733883536.770638
2024-12-10 21:18:56,771 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hae4tc/ at 1733883536.7717981
2024-12-10 21:18:56,771 [DEBUG] Data: None
2024-12-10 21:18:56,771 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:57,037 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hae4tc/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4996
2024-12-10 21:18:57,038 [DEBUG] Response: 200 (4996 bytes) (rst-63:rem-992.0:used-8 ratelimit) at 1733883537.038465
2024-12-10 21:18:57,041 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1ha9il4/ at 1733883537.041724
2024-12-10 21:18:57,041 [DEBUG] Data: None
2024-12-10 21:18:57,041 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:57,281 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1ha9il4/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3799
2024-12-10 21:18:57,438 [DEBUG] Response: 200 (3799 bytes) (rst-62:rem-991.0:used-9 ratelimit) at 1733883537.438797
2024-12-10 21:18:57,440 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9vjwq/ at 1733883537.440911
2024-12-10 21:18:57,441 [DEBUG] Data: None
2024-12-10 21:18:57,441 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:57,714 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9vjwq/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4200
2024-12-10 21:18:57,716 [DEBUG] Response: 200 (4200 bytes) (rst-62:rem-990.0:used-10 ratelimit) at 1733883537.716476
2024-12-10 21:18:57,719 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9omov/ at 1733883537.71927
2024-12-10 21:18:57,719 [DEBUG] Data: None
2024-12-10 21:18:57,719 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:57,956 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9omov/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4229
2024-12-10 21:18:57,958 [DEBUG] Response: 200 (4229 bytes) (rst-62:rem-989.0:used-11 ratelimit) at 1733883537.9584289
2024-12-10 21:18:57,960 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9fg8c/ at 1733883537.9601011
2024-12-10 21:18:57,960 [DEBUG] Data: None
2024-12-10 21:18:57,960 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:18:58,208 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9fg8c/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 6812
2024-12-10 21:18:58,210 [DEBUG] Response: 200 (6812 bytes) (rst-61:rem-988.0:used-12 ratelimit) at 1733883538.210007
2024-12-10 21:18:58,212 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733883538.2128682
2024-12-10 21:18:58,213 [DEBUG] Data: None
2024-12-10 21:18:58,213 [DEBUG] Params: {'after': 't3_1h55mng', 'limit': 200, 'raw_json': 1}
2024-12-10 21:18:59,445 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&after=t3_1h55mng&raw_json=1 HTTP/11" 200 50977
2024-12-10 21:18:59,521 [DEBUG] Response: 200 (50977 bytes) (rst-61:rem-987.0:used-13 ratelimit) at 1733883539.5214999
2024-12-10 21:18:59,533 [INFO] Fetched and filtered 11 posts.
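Up to this point the run is the PRAW fetch phase: an app-only OAuth token (POST /api/v1/access_token), two pages of r/dropship/new (the second carrying after=t3_1h55mng), and one GET /comments/<id>/ per kept submission with PRAW's defaults of limit=2048 and sort=confidence. Only about a dozen comment fetches follow two 200-post listing pages, so the script evidently filters submissions before touching comments; the filter criteria are not in the log. A rough reconstruction of client code that would produce this request pattern, with placeholder credentials and a placeholder predicate:

import praw

reddit = praw.Reddit(
    client_id="...",        # placeholder credentials; PRAW performs the
    client_secret="...",    # POST /api/v1/access_token exchange itself
    user_agent="reddit_crawler",
)

def looks_interesting(submission):
    return True  # placeholder for the script's unknown filtering criteria

posts = []
for submission in reddit.subreddit("dropship").new(limit=200):
    # PRAW paginates the listing (hence the follow-up request with after=t3_...).
    if not looks_interesting(submission):
        continue
    # Reading the comments triggers the GET /comments/<id>/ requests seen above.
    comments = submission.comments.list()
    posts.append((submission, comments))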
2024-12-10 21:18:59,567 [ERROR] Error filtering posts with GPT-3.5.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_posts_with_gpt
response = openai.chat.create(
^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/_utils/_proxy.py", line 23, in __getattr__
return getattr(proxied, attr)
AttributeError: 'Chat' object has no attribute 'create'
2024-12-10 21:18:59,568 [ERROR] Error in script execution.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 114, in filter_posts_with_gpt
response = openai.chat.create(
^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/_utils/_proxy.py", line 23, in __getattr__
return getattr(proxied, attr)
AttributeError: 'Chat' object has no attribute 'create'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 227, in run_script
good_posts = filter_posts_with_gpt(posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 129, in filter_posts_with_gpt
raise RuntimeError(f"Error filtering posts with GPT-3.5: {e}")
RuntimeError: Error filtering posts with GPT-3.5: 'Chat' object has no attribute 'create'
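This attempt moved to the v1 module-level interface but stopped one attribute short: `openai.chat` resolves (via the proxy in openai/_utils/_proxy.py) to a Chat resource whose create method lives on its completions sub-resource, hence the AttributeError. The corrected call keeps the extra `.completions` segment; placeholder message as before:

import openai

# Module-level form of the v1 call, equivalent to client.chat.completions.create(...)
# on a default client. The messages list is a placeholder.
response = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "<posts summary goes here>"}],
    temperature=0.7,
)
print(response.choices[0].message.content)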
2024-12-10 21:24:34,542 [INFO] Successfully connected to the Reddit API.
2024-12-10 21:24:34,543 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733883874.543204
2024-12-10 21:24:34,543 [DEBUG] Data: None
2024-12-10 21:24:34,543 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-10 21:24:34,546 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-10 21:24:35,655 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 654
2024-12-10 21:24:35,659 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-10 21:24:36,895 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 51639
2024-12-10 21:24:36,975 [DEBUG] Response: 200 (51639 bytes) (rst-324:rem-999.0:used-1 ratelimit) at 1733883876.9757628
2024-12-10 21:24:36,987 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbej76/ at 1733883876.987443
2024-12-10 21:24:36,987 [DEBUG] Data: None
2024-12-10 21:24:36,987 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:37,402 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbej76/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3745
2024-12-10 21:24:37,404 [DEBUG] Response: 200 (3745 bytes) (rst-322:rem-998.0:used-2 ratelimit) at 1733883877.404062
2024-12-10 21:24:37,406 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbam3w/ at 1733883877.406708
2024-12-10 21:24:37,406 [DEBUG] Data: None
2024-12-10 21:24:37,406 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:37,813 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbam3w/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3485
2024-12-10 21:24:37,814 [DEBUG] Response: 200 (3485 bytes) (rst-322:rem-997.0:used-3 ratelimit) at 1733883877.814446
2024-12-10 21:24:37,816 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hb3wd9/ at 1733883877.816637
2024-12-10 21:24:37,816 [DEBUG] Data: None
2024-12-10 21:24:37,816 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:38,223 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hb3wd9/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3665
2024-12-10 21:24:38,224 [DEBUG] Response: 200 (3665 bytes) (rst-322:rem-996.0:used-4 ratelimit) at 1733883878.2246988
2024-12-10 21:24:38,227 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hazjmj/ at 1733883878.227467
2024-12-10 21:24:38,227 [DEBUG] Data: None
2024-12-10 21:24:38,227 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:38,634 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hazjmj/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3109
2024-12-10 21:24:38,636 [DEBUG] Response: 200 (3109 bytes) (rst-321:rem-995.0:used-5 ratelimit) at 1733883878.6360261
2024-12-10 21:24:38,637 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1harj8o/ at 1733883878.637769
2024-12-10 21:24:38,637 [DEBUG] Data: None
2024-12-10 21:24:38,637 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:39,246 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1harj8o/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 8680
2024-12-10 21:24:39,247 [DEBUG] Response: 200 (8680 bytes) (rst-321:rem-994.0:used-6 ratelimit) at 1733883879.2477372
2024-12-10 21:24:39,251 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hae4tc/ at 1733883879.251044
2024-12-10 21:24:39,251 [DEBUG] Data: None
2024-12-10 21:24:39,251 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:39,555 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hae4tc/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4999
2024-12-10 21:24:39,555 [DEBUG] Response: 200 (4999 bytes) (rst-320:rem-993.0:used-7 ratelimit) at 1733883879.555817
2024-12-10 21:24:39,558 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1ha9il4/ at 1733883879.558582
2024-12-10 21:24:39,558 [DEBUG] Data: None
2024-12-10 21:24:39,558 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:39,862 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1ha9il4/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3798
2024-12-10 21:24:39,863 [DEBUG] Response: 200 (3798 bytes) (rst-320:rem-992.0:used-8 ratelimit) at 1733883879.863261
2024-12-10 21:24:39,864 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9vjwq/ at 1733883879.8647628
2024-12-10 21:24:39,864 [DEBUG] Data: None
2024-12-10 21:24:39,864 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:40,291 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9vjwq/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4198
2024-12-10 21:24:40,292 [DEBUG] Response: 200 (4198 bytes) (rst-320:rem-991.0:used-9 ratelimit) at 1733883880.292479
2024-12-10 21:24:40,294 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9omov/ at 1733883880.294344
2024-12-10 21:24:40,294 [DEBUG] Data: None
2024-12-10 21:24:40,294 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:40,579 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9omov/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4231
2024-12-10 21:24:40,580 [DEBUG] Response: 200 (4231 bytes) (rst-319:rem-990.0:used-10 ratelimit) at 1733883880.580551
2024-12-10 21:24:40,582 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9fg8c/ at 1733883880.582158
2024-12-10 21:24:40,582 [DEBUG] Data: None
2024-12-10 21:24:40,582 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:24:40,987 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9fg8c/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 6812
2024-12-10 21:24:40,988 [DEBUG] Response: 200 (6812 bytes) (rst-319:rem-989.0:used-11 ratelimit) at 1733883880.988516
2024-12-10 21:24:40,991 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733883880.991144
2024-12-10 21:24:40,991 [DEBUG] Data: None
2024-12-10 21:24:40,991 [DEBUG] Params: {'after': 't3_1h55mng', 'limit': 200, 'raw_json': 1}
2024-12-10 21:24:42,319 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&after=t3_1h55mng&raw_json=1 HTTP/11" 200 50937
2024-12-10 21:24:42,502 [DEBUG] Response: 200 (50937 bytes) (rst-318:rem-988.0:used-12 ratelimit) at 1733883882.502246
2024-12-10 21:24:42,515 [INFO] Fetched and filtered 10 posts.
2024-12-10 21:24:42,524 [ERROR] Error in script execution.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 231, in run_script
good_posts = filter_posts_with_gpt(posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 103, in filter_posts_with_gpt
posts_summary = "".join(summarize_post_for_gpt(p) for p in posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 103, in <genexpr>
posts_summary = "".join(summarize_post_for_gpt(p) for p in posts)
^^^^^^^^^^^^^^^^^^^^^^
NameError: name 'summarize_post_for_gpt' is not defined
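This run fails before any OpenAI call is reached: the generator expression at reddit_crawler.py line 103 references `summarize_post_for_gpt`, which is never defined in the module. Its real implementation is not visible in the log; a hypothetical stand-in that would satisfy that call site, assuming PRAW Submission objects with title and selftext attributes, could be:

def summarize_post_for_gpt(post):
    # Hypothetical helper; the actual reddit_crawler.py implementation is unknown.
    # Assumes `post` is a PRAW Submission exposing .title and .selftext.
    body = (post.selftext or "").strip()
    return f"Title: {post.title}\nBody: {body[:500]}\n---\n"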
2024-12-10 21:29:48,477 [INFO] Successfully connected to the Reddit API.
2024-12-10 21:29:48,478 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733884188.478265
2024-12-10 21:29:48,478 [DEBUG] Data: None
2024-12-10 21:29:48,478 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-10 21:29:48,481 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-10 21:29:48,920 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 652
2024-12-10 21:29:48,923 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-10 21:29:50,059 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 51584
2024-12-10 21:29:50,225 [DEBUG] Response: 200 (51584 bytes) (rst-10:rem-987.0:used-13 ratelimit) at 1733884190.225695
2024-12-10 21:29:50,237 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbam3w/ at 1733884190.2378602
2024-12-10 21:29:50,238 [DEBUG] Data: None
2024-12-10 21:29:50,238 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:50,561 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbam3w/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3487
2024-12-10 21:29:50,561 [DEBUG] Response: 200 (3487 bytes) (rst-9:rem-986.0:used-14 ratelimit) at 1733884190.561592
2024-12-10 21:29:50,562 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hb3wd9/ at 1733884190.562266
2024-12-10 21:29:50,562 [DEBUG] Data: None
2024-12-10 21:29:50,562 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:50,869 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hb3wd9/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3666
2024-12-10 21:29:50,870 [DEBUG] Response: 200 (3666 bytes) (rst-9:rem-985.0:used-15 ratelimit) at 1733884190.870095
2024-12-10 21:29:50,871 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hazjmj/ at 1733884190.871546
2024-12-10 21:29:50,871 [DEBUG] Data: None
2024-12-10 21:29:50,871 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:51,154 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hazjmj/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3106
2024-12-10 21:29:51,155 [DEBUG] Response: 200 (3106 bytes) (rst-9:rem-984.0:used-16 ratelimit) at 1733884191.1553018
2024-12-10 21:29:51,156 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1harj8o/ at 1733884191.15636
2024-12-10 21:29:51,156 [DEBUG] Data: None
2024-12-10 21:29:51,156 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:51,582 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1harj8o/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 8680
2024-12-10 21:29:51,609 [DEBUG] Response: 200 (8680 bytes) (rst-8:rem-983.0:used-17 ratelimit) at 1733884191.6090791
2024-12-10 21:29:51,611 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hae4tc/ at 1733884191.611627
2024-12-10 21:29:51,611 [DEBUG] Data: None
2024-12-10 21:29:51,611 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:51,993 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hae4tc/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4996
2024-12-10 21:29:51,995 [DEBUG] Response: 200 (4996 bytes) (rst-8:rem-982.0:used-18 ratelimit) at 1733884191.995029
2024-12-10 21:29:51,997 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1ha9il4/ at 1733884191.997531
2024-12-10 21:29:51,997 [DEBUG] Data: None
2024-12-10 21:29:51,997 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:52,504 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1ha9il4/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3798
2024-12-10 21:29:52,505 [DEBUG] Response: 200 (3798 bytes) (rst-7:rem-981.0:used-19 ratelimit) at 1733884192.505054
2024-12-10 21:29:52,506 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9vjwq/ at 1733884192.5064409
2024-12-10 21:29:52,506 [DEBUG] Data: None
2024-12-10 21:29:52,506 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:52,811 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9vjwq/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4195
2024-12-10 21:29:53,185 [DEBUG] Response: 200 (4195 bytes) (rst-7:rem-980.0:used-20 ratelimit) at 1733884193.185298
2024-12-10 21:29:53,188 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9v7dz/ at 1733884193.188737
2024-12-10 21:29:53,188 [DEBUG] Data: None
2024-12-10 21:29:53,188 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:53,529 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9v7dz/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 2882
2024-12-10 21:29:53,530 [DEBUG] Response: 200 (2882 bytes) (rst-6:rem-979.0:used-21 ratelimit) at 1733884193.5301068
2024-12-10 21:29:53,531 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9omov/ at 1733884193.531501
2024-12-10 21:29:53,531 [DEBUG] Data: None
2024-12-10 21:29:53,531 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:53,692 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9omov/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4230
2024-12-10 21:29:53,703 [DEBUG] Response: 200 (4230 bytes) (rst-6:rem-978.0:used-22 ratelimit) at 1733884193.703344
2024-12-10 21:29:53,704 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1h9fg8c/ at 1733884193.704766
2024-12-10 21:29:53,704 [DEBUG] Data: None
2024-12-10 21:29:53,704 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:29:54,041 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1h9fg8c/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 6812
2024-12-10 21:29:54,043 [DEBUG] Response: 200 (6812 bytes) (rst-6:rem-977.0:used-23 ratelimit) at 1733884194.043029
2024-12-10 21:29:54,045 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733884194.0454772
2024-12-10 21:29:54,045 [DEBUG] Data: None
2024-12-10 21:29:54,045 [DEBUG] Params: {'after': 't3_1h55mng', 'limit': 200, 'raw_json': 1}
2024-12-10 21:29:55,060 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&after=t3_1h55mng&raw_json=1 HTTP/11" 200 50902
2024-12-10 21:29:55,204 [DEBUG] Response: 200 (50902 bytes) (rst-5:rem-976.0:used-24 ratelimit) at 1733884195.2047071
2024-12-10 21:29:55,215 [INFO] Fetched and filtered 10 posts.
2024-12-10 21:29:55,229 [ERROR] Error filtering posts with GPT-3.5.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 127, in filter_posts_with_gpt
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
2024-12-10 21:29:55,230 [ERROR] Error in script execution.
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 127, in filter_posts_with_gpt
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
...<5 lines>...
temperature=0.7,
)
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/site-packages/openai/lib/_old_api.py", line 39, in __call__
raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 260, in run_script
good_posts = filter_posts_with_gpt(posts)
File "/Users/chrisgaya/openai-python/reddit_crawler.py", line 144, in filter_posts_with_gpt
raise RuntimeError(f"Error filtering posts with GPT-3.5: {e}")
RuntimeError: Error filtering posts with GPT-3.5:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.
You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.
Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`
A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
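This run has reverted to the removed `openai.ChatCompletion.create` call (now at reddit_crawler.py line 127) and dies exactly as before. A secondary detail the repeated tracebacks expose: the handler at line 144 re-raises a bare RuntimeError, so the original error is only attached through implicit context chaining ("During handling of the above exception..."). If that wrapper is kept after migrating the API call, catching the SDK's base exception and chaining explicitly keeps the cause attached deliberately; a sketch, assuming the except clause is meant to cover OpenAI failures only:

import openai

def filter_posts_with_gpt(posts):
    try:
        ...  # migrated client.chat.completions.create(...) call goes here
    except openai.OpenAIError as e:
        # Explicit chaining: the underlying OpenAI error stays attached as the
        # RuntimeError's __cause__ rather than only as implicit context.
        raise RuntimeError(f"Error filtering posts with GPT-3.5: {e}") from e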
2024-12-10 21:46:48,030 [INFO] Successfully connected to the Reddit API.
2024-12-10 21:46:48,031 [DEBUG] Fetching: GET https://oauth.reddit.com/r/dropship/new at 1733885208.031669
2024-12-10 21:46:48,031 [DEBUG] Data: None
2024-12-10 21:46:48,031 [DEBUG] Params: {'limit': 200, 'raw_json': 1}
2024-12-10 21:46:48,034 [DEBUG] Starting new HTTPS connection (1): www.reddit.com:443
2024-12-10 21:46:48,929 [DEBUG] https://www.reddit.com:443 "POST /api/v1/access_token HTTP/11" 200 653
2024-12-10 21:46:48,932 [DEBUG] Starting new HTTPS connection (1): oauth.reddit.com:443
2024-12-10 21:46:50,569 [DEBUG] https://oauth.reddit.com:443 "GET /r/dropship/new?limit=200&raw_json=1 HTTP/11" 200 51586
2024-12-10 21:46:50,874 [DEBUG] Response: 200 (51586 bytes) (rst-190:rem-999.0:used-1 ratelimit) at 1733885210.874937
2024-12-10 21:46:50,886 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbej76/ at 1733885210.886812
2024-12-10 21:46:50,886 [DEBUG] Data: None
2024-12-10 21:46:50,887 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:46:51,385 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbej76/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 4385
2024-12-10 21:46:51,386 [DEBUG] Response: 200 (4385 bytes) (rst-189:rem-998.0:used-2 ratelimit) at 1733885211.386742
2024-12-10 21:46:51,389 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hbam3w/ at 1733885211.38942
2024-12-10 21:46:51,389 [DEBUG] Data: None
2024-12-10 21:46:51,389 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:46:51,798 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hbam3w/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3484
2024-12-10 21:46:51,800 [DEBUG] Response: 200 (3484 bytes) (rst-188:rem-997.0:used-3 ratelimit) at 1733885211.800368
2024-12-10 21:46:51,802 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hb3wd9/ at 1733885211.802351
2024-12-10 21:46:51,802 [DEBUG] Data: None
2024-12-10 21:46:51,802 [DEBUG] Params: {'limit': 2048, 'raw_json': 1, 'sort': 'confidence'}
2024-12-10 21:46:52,453 [DEBUG] https://oauth.reddit.com:443 "GET /comments/1hb3wd9/?limit=2048&sort=confidence&raw_json=1 HTTP/11" 200 3657
2024-12-10 21:46:52,455 [DEBUG] Response: 200 (3657 bytes) (rst-188:rem-996.0:used-4 ratelimit) at 1733885212.4555871
2024-12-10 21:46:52,458 [DEBUG] Fetching: GET https://oauth.reddit.com/comments/1hazjmj/ at 1733885212.458496
2024-12-10 21:46:52,458 [DEBUG] Data: None