TempleOS
Troll honeypot, apparently.
Suggested blocks:
TempleOS
That sounds triggered.
name.com usually because I’m too dumb to remember any other domain name.
Acceptable isn’t a percentage, but I see that in your opinion it’s acceptable. Thanks for making your content open source. I do wish your project the best of luck. I don’t think I have what it takes to validate this myself, but if I end up hosting an instance I’ll probably start using this tool myself. It’s better than nothing; at present I have zero instances but also zero mods lined up.
Just going to argue on behalf of the other users who apparently know way more than you and I do about this stuff:
WhY nOt juSt UsE thE FBi daTaBaSe of CSam?!
(because one doesn’t exist)
(because if one existed it would either be hosting CSAM itself or publishing just the hashes of files - hashes which won’t match if even one bit is changed due to transmission data loss / corruption, automated resizing from image hosting sites, etc.; see the quick sketch below)
(because this shit is hard to detect)
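To illustrate the “one bit changed” point, here’s a minimal Python sketch. The byte string just stands in for an image file, nothing here comes from any real hash database: a cryptographic hash like SHA-256 changes completely when a single bit flips, which is why exact-hash matching breaks on re-encoded or resized copies.

```python
import hashlib

# Stand-in for the raw bytes of an image file; any byte string shows the same effect.
original = b"\xff\xd8\xff\xe0" + bytes(range(256)) * 64

# Flip a single bit - a lossy re-encode or resize would change far more than this.
modified = bytearray(original)
modified[100] ^= 0x01

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(modified)).hexdigest())  # completely different digest
```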
Some sites have tried automated detection of CSAM images. YouTube, in an effort to protect children, continues to falsely flag 30-year-old women as children.
OP, I’m not saying you should give up, and maybe what you’re working on could be the beginning of something that truly helps in the field of CSAM detection. I’ve got only one question for you (which hopefully won’t be discouraging to you or others): what’s your false-positive (or false-negative) detection rate? Or, maybe a question you may not want to answer: how are you training this?
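To be clear about what I’m asking, here’s a rough sketch of how those rates are usually computed. The counts are made up purely for illustration, not taken from OP’s tool:

```python
# Hypothetical confusion-matrix counts for an image classifier.
true_positives = 40    # flagged, actually CSAM
false_positives = 10   # flagged, actually benign
true_negatives = 940   # not flagged, actually benign
false_negatives = 10   # not flagged, actually CSAM

false_positive_rate = false_positives / (false_positives + true_negatives)
false_negative_rate = false_negatives / (false_negatives + true_positives)

print(f"FPR: {false_positive_rate:.1%}")  # share of benign images wrongly flagged
print(f"FNR: {false_negative_rate:.1%}")  # share of actual CSAM that slips through
```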
Have you tried the Excalidraw plugin for Obsidian? This may be closer to what you’re looking for. Otherwise, would the Canvas feature do what you need?
try obsidian.md
Excalidraw is nice. Also, I want to throw in a mention for mermaid.live (mermaid js). A little less flexibility, but it’s nice. There’s also kroki.io, which hosts a lot of these types of apps.