IRCaBot 2.1.0
GPLv3 © acetone, 2021-2022
#i2p-dev
/2024/01/30
@eyedeekay
&eche|on
&kytv
&zzz
+R4SAS
+RN
+StormyCloud
+T3s|4
+acetone
+dr|z3d
+goose2_
+hagen
+orignal
+postman
+weko
An0nm0n
Arch
Danny
DeltaOreo
DiCEy1904
FreefallHeavens
Irc2PGuest48909
Irc2PGuest71836
Nausicaa
Onn4l7h
SoniEx2
T3s|4_
Teeed
anon2
b3t4f4c3__
bak83_
boonst
cumlord
dr4wd3
eyedeekay_bnc
goose2
hk
itsjustme
j6
mareki2p
numberwang
onon_1
poriori
profetikla
qend-irc2p
rapidash
shiver_
u5657
unwr
user
veiledwizard
w8rabbit
x74a6
orignal yes, good point
orignal for example, if a router is ygg-only
orignal theoretically it should be "R" because it is reachable by some transport
orignal practically it's not
orignal so, since there's no R or U, it means I'm doing it right
eyedeekay Oh that's dumb. The cronjob that downloads my plugins from github and puts them onto my eepsite has been hitting the github rate-limit every time it runs, for months
StormyCloud them darn rate-limits
eyedeekay That's why the only plugin anyone can ever download is railroad: the scripts have moved them all to the backup location, then failed to download the new plugins, and also failed to return the backups to the correct location
eyedeekay because wget is cheerfully saving the text of the error page as "package.tar.gz" or whatever, since that's the filename I told it to use
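A minimal sketch of the guard that failure mode calls for, with a hypothetical plugin URL (the real script and paths differ): fetch to a temp file, and only rotate the backup once wget has actually succeeded and produced a non-empty archive.

    # hypothetical URL; output name follows the "package.tar.gz" example above
    url="https://github.com/eyedeekay/example-plugin/releases/latest/download/package.tar.gz"
    tmp="$(mktemp)"
    if wget -q -O "$tmp" "$url" && [ -s "$tmp" ]; then
        # the download worked: now it is safe to rotate the old copy out
        [ -f package.tar.gz ] && mv package.tar.gz package.tar.gz.bak
        mv "$tmp" package.tar.gz
    else
        # failed or empty: leave the published file and its backup untouched
        rm -f "$tmp"
    fi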
dr|z3d why don't you move the build process in house, eyedeekay?
dr|z3d oh, wait, you're saying you've rate-limited your own github instance?
eyedeekay No, what actually happens is that I build everything on my laptop, then I push it all up to github, then I either ssh in to my eepsite hosts (there are 2) and:
eyedeekay sudo -u i2psvc bash -i
eyedeekay cd ~/i2p-config/eepsite/docroot
eyedeekay find . -type d -maxdepth 1 -exec bash -c "cd {} && edgar" \;
eyedeekay where `edgar` re-generates the pages locally and downloads the releases back from github to the eepsite host
eyedeekay OR I wait for the cronjob to do the same thing
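Non-interactively, which is presumably more or less what the cronjob runs, that per-host step could be collapsed into a single command; a sketch using the same paths, with `edgar` being the page generator described above:

    # run on each eepsite host; regenerates pages and re-fetches releases
    # in every top-level project directory under the docroot
    sudo -u i2psvc bash -c '
      cd ~/i2p-config/eepsite/docroot &&
      find . -maxdepth 1 -type d -exec bash -c "cd {} && edgar" \;
    '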
eyedeekay But since I have like a hundred or so various artifacts to download from github I hit the rate-limit before it finishes
eyedeekay Basically I use github as a mirror and to store the files while I transfer them to the eepsite hosts; it's also why I go to lengths to make my eepsite and my github page look exactly the same, ideally they should be exactly the same
eyedeekay But even the github page gets built locally, on my laptop, in ~/.i2p/eepsite/docroot actually, so I can see it
eyedeekay All github ever does is copy the files
dr|z3d well, you need to make some adjustments to avoid being rate limited, this much I can tell you :)
dr|z3d well, if you're building the eepsite stuff locally, maybe you can bypass github for .i2p?
eyedeekay and I'd have a new instance ready to multihome right next to the others
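One way that bypass could look, as a sketch: push the docroot built on the laptop straight to a host over ssh, skipping github for the in-network copy entirely. The host alias, and logging in directly as i2psvc, are assumptions; the two paths are the ones mentioned in this conversation.

    # mirror the locally built site onto an eepsite host (host alias is made up);
    # --delete keeps the host an exact mirror of the laptop copy
    rsync -avz --delete \
        ~/.i2p/eepsite/docroot/ \
        i2psvc@eepsite-host-1:i2p-config/eepsite/docroot/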
dr|z3d you'll figure it out I'm sure. you love complexity :)
eyedeekay Less so after having introduced so much of it into my life lol
dr|z3d well, when the workflow you've introduced to make your life easier starts making your life harder, time to rethink.
eyedeekay I'm trying to get it back to the point where if I just clone everything and download the release assets, I'm done, no more recursing or running make targets, just mirroring
eyedeekay That was the original idea
eyedeekay But I guess I need an API key for the mirroring part
eyedeekay There, that's the rate-limit taken care of
dr|z3d *thumbs up*
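For reference, attaching a token to the github API calls is what lifts the quota from 60 to 5,000 requests per hour; a sketch with a hypothetical repo, where GITHUB_TOKEN holds a personal access token stored on the host:

    # see how much of the quota is left (and confirm the token is being used)
    wget -qO- --header="Authorization: Bearer $GITHUB_TOKEN" \
        https://api.github.com/rate_limit

    # list the latest release for a repo under the higher authenticated quota;
    # each asset's browser_download_url in the response can then be fetched as before
    wget -qO- --header="Authorization: Bearer $GITHUB_TOKEN" \
        "https://api.github.com/repos/eyedeekay/example-plugin/releases/latest"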