

Bet. Looking into that now. Thanks!
I believe I have 11 GB of VRAM, so I should be good to run decent models, from what I’ve been told by the other AIs.
Server is my rig, which is running Windows. Phone is an iPhone.
Exposing the port is something I’ve tried to do in the past with no success! When you say “change the bind address,” do I do that in the Windows Defender Firewall, in the inbound rules section?
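From what I’ve pieced together so far, the bind address and the firewall rule are two separate steps, something like this (just a sketch of what I’ve seen suggested, assuming Ollama’s default port 11434):

```powershell
# Run in an elevated PowerShell window on the Windows rig.
# Assumes Ollama's default port (11434); adjust if yours is different.

# 1) Tell Ollama to listen on all interfaces instead of just 127.0.0.1
setx OLLAMA_HOST "0.0.0.0"

# 2) Open that port in Windows Defender Firewall (this is the inbound-rule part)
New-NetFirewallRule -DisplayName "Ollama LAN" -Direction Inbound -Protocol TCP -LocalPort 11434 -Action Allow

# 3) Restart Ollama so it picks up the new environment variable
```

Does that sound right, or am I mixing the two up?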
Backend/frontend. I see those terms a lot, but I never got an explanation for them. In my case, the backend would be Ollama on my rig, and the frontend would be whatever I use on my phone to talk to it, whether that’s an app or a web UI. Is that correct?
I will add Kobold to my list of AIs to check out in the future. Thanks!
Ollama has an app (or maybe interface is a better term for it) on Windows, right? That’s what I download models to, and then I can use said app to talk to the models. I believe Reins: Chat for Ollama is the iPhone app that lets me use my phone to chat with the models that are on the Windows rig.
Yes, exactly! I would love to keep it on my network for now. I’ve read that “exposing a port” is something I may have to do in my Windows firewall options.
Yes, I have Ollama on my Windows rig, but I’m down to try a different one if you suggest it. TBH, I’m not sure if LibreChat has a web UI. I think accessing the LLM on my phone via web browser would be easiest, but there are apps out there like Reins and Enchanted that I could take advantage of.
For right now I just want to do whatever is easiest so I can get a better understanding of what I’m doing wrong.
Oh! Also, I’m using Windows on my PC, and my phone is an iPhone.
I’m not using Linux yet, but that is on my to-do list for the future, after I get more comfortable with some of the basics of self-hosting.
Bet, I’ll try that when I get home tonight. If I don’t have success, can I message you directly?
Bet. I believe what you mentioned is best for accessing my LLM no matter where I am in the world, correct? If so, I will try this one after I try what the other person suggested.
Thank you!
lol I have! They all say more or less the same thing, but it’s just not working for me.
I thought I was stuck there, but I misspoke. I made an edit to the original post. Thanks for the insight tho! I’m sure that’ll be helpful once I actually get there.
I made an error in my original post. Please see the edit I made.
But I think I’m understanding a bit! I need to literally create a file named “/etc/radicale/config”, and then copy/paste the configuration settings into said file. Once I do that, I should be able to move on to authentication and then addresses.
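If it helps me keep it straight, my understanding is that the file would end up looking roughly like this (a sketch pieced together from the Radicale docs; the users file path is just a placeholder):

```ini
# /etc/radicale/config - minimal sketch, not a complete config

[server]
# listen on all interfaces so other devices on the network can reach it
hosts = 0.0.0.0:5232

[auth]
# authentication comes from an htpasswd-style users file (created separately)
type = htpasswd
htpasswd_filename = /etc/radicale/users
htpasswd_encryption = md5
```

Is that roughly the shape of it, or am I off base?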
I misspoke earlier. I wasn’t having issues with the addresses part; I didn’t even make it that far. I’m stuck on authentication. I updated the original post and added a pic for clarity.
Dope! This is exactly what I needed! I would say this is a very “hand-holding” explanation, which is perfect because I’m starting with 0% knowledge in this field! I’ve learned so much already from this post and your comment!
So here’s where I’m at:
- A backend is where all the weird C++ stuff happens to generate a response from an AI.
- A frontend is a pretty app or webpage that takes that response and makes it more digestible for the user.
- Agreed. I’ve seen in other posts that exposing a port in Windows Defender Firewall is the easiest (and safest?) way to go for specifically what I’m looking for. I don’t think I need to forward a port, as that would be more for remote access from outside my network.
- I went to the whatismyipaddress website. The IPv6 address was identical to one of the ones I have; the IPv4 was not. (But I don’t think that matters moving forward.)
- I ran ipconfig in the Command Prompt to find the info, and my IPv4 is 10.blahblahblah.
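For when I get home, my plan for a quick sanity check is something like this (the 10.x address below is just a stand-in for whatever ipconfig actually reports):

```powershell
# On the Windows rig: confirm something is actually listening on Ollama's port
Test-NetConnection -ComputerName 10.0.0.42 -Port 11434   # stand-in LAN address

# Then from the iPhone on the same Wi-Fi, open http://10.0.0.42:11434 in Safari;
# Ollama should answer with a plain "Ollama is running" page.
```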
I’m ready for the next steps!