I'm wondering if it would be useful to create a discovery script in the ~/scripts folder which gathers up information and places it into a temporary folder.
This could potentially be useful in the Get Help section. It feels like we ask for information, but new users aren't always savvy enough to go chase that down.
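To make the idea concrete, here's a minimal sketch of what such a "discovery" script could look like. The file locations are assumptions about a typical OctoPrint install on a Pi (not an OctoPrint API), and the output name is a placeholder:

```python
# Sketch of a "discovery" script: copy whatever diagnostic files exist
# into a zip inside a temporary folder. Paths below are assumed typical
# locations, not guaranteed for every install.
import os
import tempfile
import zipfile

# Candidate files a helper might want when diagnosing an issue (assumed paths)
CANDIDATES = [
    os.path.expanduser("~/.octoprint/logs/octoprint.log"),
    os.path.expanduser("~/.octoprint/logs/serial.log"),
    os.path.expanduser("~/.octoprint/config.yaml"),
    "/etc/os-release",
]

def gather(out_dir=None):
    """Collect whichever CANDIDATES exist into discovery.zip in a temp folder."""
    out_dir = out_dir or tempfile.mkdtemp(prefix="octoprint-discovery-")
    archive = os.path.join(out_dir, "discovery.zip")
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in CANDIDATES:
            if os.path.isfile(path):
                zf.write(path, arcname=os.path.basename(path))
    return archive

if __name__ == "__main__":
    print(gather())
```

The script silently skips missing files, so the same list can run on different setups without erroring out.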
In a perfect world, and without the EU tightening down on data privacy, the user could click a Get Support button somewhere in OctoPrint, and this script would kick off and upload a zipped folder to this forum software.
...Or maybe a cloud-based troubleshooting wizard which runs from the OctoPrint session. I'm imagining a plugin which displays a dialog box, iteratively asks questions (that were given from a remote JSON response) and collects answers (submitting these back to the cloud to receive the next question).
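The question/answer loop such a plugin would run could be sketched like this. The JSON shape (`id`, `question`, `done`, `result`) and the service URL are made up for illustration, not a real OctoPrint or cloud API; the transport is injectable so the dialog logic stays testable:

```python
# Sketch of the iterative Q&A loop a troubleshooting-wizard plugin could
# drive: fetch a question from the cloud, collect the answer, submit it
# back, repeat until the service says it's done. Endpoint and JSON field
# names are assumptions.
import json
from urllib import request

SERVICE = "https://example.com/wizard"  # placeholder endpoint

def http_fetch(payload):
    """POST the previous answer (or {} initially), return the next step."""
    req = request.Request(
        SERVICE,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

def run_wizard(ask, fetch=http_fetch):
    """Loop until the remote service reports done, then return its result."""
    payload = {}  # first request carries no answer
    while True:
        step = fetch(payload)
        if step.get("done"):
            return step.get("result")
        payload = {"id": step["id"], "answer": ask(step["question"])}
```

In a real plugin, `ask` would be the dialog box rather than a console prompt.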
If that's the case, the plugin would need AWS credentials or similar embedded in its code, which opens up security issues for the plugin author. And even though AWS has an expiration feature you can set on uploads, in my experience it just doesn't seem to work, so the bucket will stockpile these unless you clear them out yourself.
A quick serverless script takes care of the credentials, and can cut down enough on being abused as a random upload tool to not be a headache. S3 object expiration works fine, and a pile of compressed log files takes up basically zero space anyway.
Then you're luckier than I am (or you were in a different AWS region than mine), because I observed that the files did not auto-expire on their expiration date. Since each of mine was around 3 GB and they arrived about twice a day, this stacked up quite a bit over the span of two weeks when I didn't manually remove them.
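For what it's worth, the usual way to get this behavior is a bucket lifecycle rule rather than per-object expiration metadata. The sketch below just builds the rule document (bucket prefix and rule ID are placeholders); with boto3 it would be applied via `put_bucket_lifecycle_configuration`. One caveat worth knowing: S3 evaluates lifecycle rules asynchronously, so actual deletion can lag the expiration date, which might explain files appearing to linger.

```python
# Build an S3 lifecycle rule document that expires uploads after N days.
# Prefix and rule ID are illustrative placeholders.
def expiry_rule(prefix="uploads/", days=14):
    return {
        "Rules": [
            {
                "ID": "expire-support-zips",
                "Filter": {"Prefix": prefix},
                "Status": "Enabled",
                "Expiration": {"Days": days},
            }
        ]
    }
```

Applying it once on the bucket means no plugin code ever has to clean up.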
Now that this forum accepts .zip file attachments, I think the "discovery" script should collect the relevant information into a .zip file, have the user's browser download it, and then redirect the browser to this forum (perhaps even to the Get Help category) with instructions to upload the just-downloaded .zip file.
I think this should be possible if the "discovery" script was initiated from the OctoPrint page(s) displayed in the user's browser.
The plugin then opens a link to that URL in another tab, and since the user read the earlier dialog, they know to download/save the zip file to, say, their Desktop, and to remember to upload that zip file from their Desktop to the forum.
Holding the hand of a user who hasn't yet registered on the forum is a little problematic, methinks.
Also problematic is the EU data-privacy situation Gina's under. She likely won't want to be the person gathering the user's data like this if OctoPrint itself is doing it. If a plugin author wants to take that on, I'd think she could wash her hands of it with respect to paperwork.
To be honest, I'd prefer not to drown in zip files. I already groan every time I have to download a log just to take a look at it, instead of being able to click on a gist link. Those additional seconds of download, right-click, Open with really add up over the day (plus the logs clutter up my hard drive).
What I'd like to have (and would have written if I only could find the time for it) is
gather logs + version info
push to dedicated service
service creates unique link which basically looks like a gist
user is prompted to share said link
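The four steps above could be sketched like this on the client side. The response shape (`{"url": ...}`) and the upload mechanism are assumptions about the hypothetical service; the upload is injectable so the flow can be exercised without it existing yet:

```python
# Sketch of the gather -> push -> link flow. Nothing here is an existing
# service; "upload" stands in for an HTTP POST to it.
import io
import zipfile

def bundle(files):
    """Step 1: gather logs + version info into an in-memory zip.

    `files` maps archive names to their text content."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, content in files.items():
            zf.writestr(name, content)
    return buf.getvalue()

def share(files, upload):
    """Steps 2-4: push the bundle, return the gist-like link for the user."""
    reply = upload(bundle(files))  # e.g. an HTTP POST in real life
    return reply["url"]            # assumed response shape
```

The service side would store the blob, mint a short unique ID, and render it gist-style at that URL.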
Originally I was planning on just building something like that with anonymous gist access (sharing a GitHub access token is obviously an issue), but then GitHub decided to shut anonymous gists down (which I can also understand).
If someone could come up with something that ticks the above boxes, I'd be extremely happy.
GDPR-wise, nothing may be sent automatically, so the user would need to consent to the collection and the post to the service. Ideally there would also be some kind of delete link, so the user can always delete the stuff on their own (less admin overhead). The collection of this kind of "personal data" under consent, plus the fact that it is needed to fulfill a service (= help the user), should make it valid under Article 6 as far as I understand.
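One low-overhead way to get that delete link: mint an unguessable token at upload time and only honor deletion requests that present it. The names and URL below are illustrative, not an existing service:

```python
# Sketch of a self-service delete link: the service stores a random token
# alongside each upload and embeds it in the link shown to the user.
import secrets

def make_upload_record(object_key):
    """Create the stored record plus the delete link to show the uploader."""
    token = secrets.token_urlsafe(32)
    return {
        "key": object_key,
        "delete_token": token,
        "delete_link": f"https://example.com/delete/{object_key}?token={token}",
    }

def may_delete(record, presented_token):
    # constant-time comparison avoids leaking the token via timing
    return secrets.compare_digest(record["delete_token"], presented_token)
```

No accounts, no admin queue: whoever holds the link can remove their own data.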
Yes, and I've met the owner in person; he's local here in San Diego. You can log in with your GitHub account and just create microservices.
As public microservices, they're free. If you want to make them private, they cost money. The source is public, so don't put anything in there that you want kept secret.
That would prevent anyone but a select few from reading it. Which means only a select few could help with log analysis. Doesn't sound like that would scale well. But maybe I'm completely misunderstanding you right now.