LLM not available in your area? Snowflake now enables cross-region inference
With one line of code, users can process requests on Snowflake's Cortex AI in a different region even if an LLM isn’t available in theirs.
But AI development is moving so quickly that some organizations have had no choice but to wait until models became available in their tech stack's region — often due to resource constraints, Western-centric bias and multilingual barriers. Organizations can now privately and securely use LLMs in the U.S., EU, and Asia Pacific and Japan (APJ) without incurring additional egress charges. Agarwal explains that if both regions operate on Amazon Web Services (AWS), data privately traverses that global network and remains securely within it, thanks to automatic encryption at the physical layer.
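The "one line of code" mentioned above refers to an account-level parameter. A minimal sketch of what that looks like, assuming the `CORTEX_ENABLED_CROSS_REGION` parameter and region values described in Snowflake's documentation (the model name in the follow-up query is illustrative, not taken from this article):

```sql
-- Allow Cortex AI requests from this account to be processed in any
-- region where the requested LLM is available (requires ACCOUNTADMIN).
ALTER ACCOUNT SET CORTEX_ENABLED_CROSS_REGION = 'ANY_REGION';

-- Subsequent Cortex calls are then routed transparently, e.g.:
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'llama3.1-70b',                          -- illustrative model name
    'Summarize cross-region inference.'      -- prompt
);
```

The parameter can also be scoped to specific cloud regions (for example, a value restricting routing to AWS regions) rather than `'ANY_REGION'`, which is how an organization would keep traffic on a single provider's encrypted backbone as described above.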
Or read this on VentureBeat