
Access Larger Dataset Faster And Easier To Accelerate Your ML Models Training In Vertex AI

  • aster.cloud
  • August 19, 2022
  • 3 minute read

Vertex AI Training delivers a serverless approach to simplify the ML model training experience for customers. As such, training data does not persist on the compute clusters by design. In the past, customers had only Cloud Storage (GCS) or BigQuery (BQ) as storage options. Now, you can also use NFS shares, such as Filestore, for training jobs and access data in the NFS share as you would files in a local file system.

Built-in NFS support for custom training jobs provides the following benefits:



  • Provides an easy way to store and access large datasets for Vertex AI Training, without the cumbersome work of moving training data around.
  • Training jobs execute faster because the data download step is eliminated.
  • Data streams over the network with higher throughput than alternative storage solutions.

This article demonstrates how to create a Filestore instance and how to use the data that’s stored in the instance to train a model with your custom training code.

Create a Filestore instance and copy data

First let’s create a Filestore instance as our NFS file server.

In the Cloud Console, go to the Filestore Instances page and click Create instance.

 

Configure the instance based on your needs, noting the following:

  • For this tutorial, we used the “default” VPC network for simplicity. You may choose any network you want; just note its name, as we will need it later.
  • Ensure that you are using “private service access” as the connection mode.

For in-depth instructions, see Creating instances.

 

Your new instance will appear on the dashboard page. Click the instance name to view its details.

Save the NFS mount point information, which is in the form of SERVER:PATH. We will use it later.
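The SERVER:PATH string maps directly onto the server and path fields of the training job spec. As a sketch, a small helper can split it into those fields (the helper name and the sample address below are illustrative, not part of any Vertex AI API):

```python
def parse_nfs_mount(mount_point: str) -> dict:
    """Split a Filestore mount point of the form SERVER:PATH
    into the fields used by an nfsMounts entry in the job spec."""
    server, _, path = mount_point.partition(":")
    if not server or not path:
        raise ValueError(f"expected SERVER:PATH, got {mount_point!r}")
    return {"server": server, "path": path}

# Example with an illustrative address:
print(parse_nfs_mount("10.76.0.10:/fileshare"))
# {'server': '10.76.0.10', 'path': '/fileshare'}
```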

 

Copy data to your instance by following the instructions from the official guide.

Set up VPC Network Peering

Since we chose “private service access” mode for our Filestore instance as mentioned above, we already have VPC peering established between our network and Google services. If you’re using a third-party NFS solution, you may need to set up the peering yourself as instructed in Set up VPC Network Peering.

Create a Custom Job accessing NFS

Once you have the NFS share and VPC peering set up, you are ready to use them with your custom training jobs. In this section, we use the gcloud CLI to create a custom training job that can access the files in your NFS share.

Specifically, the process can be simplified into the following general steps:

  1. Choose a mount point directory under the path /mnt/nfs/. Your NFS share will be mounted to this directory when you submit the job.
  2. In your custom code, access your NFS file share via the local path to your mount point directory.
  3. Specify the nfsMounts field and the network field in your training job request and submit it.

For example, suppose we use my_mount as the mount point directory. In our custom code, we can then read the data stored in our Filestore instance from /mnt/nfs/my_mount:

 

with open('/mnt/nfs/my_mount/data.csv', 'r') as f:
  lines = f.readlines()

 

We may also write to the Filestore instance via that local path:

with open('/mnt/nfs/my_mount/epoch3.log', 'a') as f:
  f.write('success!\n')
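Because the /mnt/nfs/ path only exists inside the training job, it can help to make the mount root configurable so the same read/write code also runs locally before you submit the job. A minimal sketch of that pattern (the NFS_ROOT environment variable and the temp-directory fallback are our own convention, not part of Vertex AI):

```python
import os
import tempfile

# Inside the job, files live under /mnt/nfs/<mount point>;
# locally, fall back to a scratch directory for testing.
MOUNT_ROOT = os.environ.get("NFS_ROOT", "/mnt/nfs/my_mount")
if not os.path.isdir(MOUNT_ROOT):
    MOUNT_ROOT = tempfile.mkdtemp(prefix="nfs-local-")

# Same append-mode write as above, just relative to the mount root.
log_path = os.path.join(MOUNT_ROOT, "epoch3.log")
with open(log_path, "a") as f:
    f.write("success!\n")
```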

 

Now suppose we have built a custom container image, gcr.io/PROJECT_ID/nfs-demo, containing the above code. We can submit the training job with commands like the following:

 

PROJECT_ID='YOUR-PROJECT'
LOCATION=us-central1
JOB_NAME='nfs-demo'
IMAGE_URI="gcr.io/$PROJECT_ID/nfs-demo:latest"

gcloud ai custom-jobs create \
 --project=${PROJECT_ID} \
 --region=${LOCATION} \
 --display-name=${JOB_NAME} \
 --config=config.yaml

 


 

The config.yaml file describes the CustomJobSpec and should include the network and NFS mount settings, like the following:

 

network: projects/PROJECT_NUMBER/global/networks/default
workerPoolSpecs:
   - machineSpec:
       machineType: n1-standard-8
     replicaCount: 1
     containerSpec:
       imageUri: 'gcr.io/PROJECT_ID/nfs-demo:latest'
     nfsMounts:
       - server: 10.76.0.10
         path: /fileshare
         mountPoint: my_mount
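The same spec can also be assembled in code before being serialized to config.yaml, which makes it easy to validate the fields programmatically. A minimal sketch using the placeholder values from above (the build_custom_job_spec helper is ours, not a Vertex AI API):

```python
def build_custom_job_spec(network: str, image_uri: str,
                          server: str, path: str, mount_point: str) -> dict:
    """Build a CustomJobSpec-shaped dict with one worker pool
    and one NFS mount, mirroring the YAML above."""
    return {
        "network": network,
        "workerPoolSpecs": [{
            "machineSpec": {"machineType": "n1-standard-8"},
            "replicaCount": 1,
            "containerSpec": {"imageUri": image_uri},
            "nfsMounts": [{
                "server": server,
                "path": path,
                "mountPoint": mount_point,
            }],
        }],
    }

spec = build_custom_job_spec(
    network="projects/PROJECT_NUMBER/global/networks/default",
    image_uri="gcr.io/PROJECT_ID/nfs-demo:latest",
    server="10.76.0.10", path="/fileshare", mount_point="my_mount",
)
print(spec["workerPoolSpecs"][0]["nfsMounts"][0]["server"])  # 10.76.0.10
```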

 

We can then check the status of the training job and confirm that it successfully reads and writes data from the NFS file share.

Summary

In this article, we used Filestore to demonstrate how to access files in an NFS share by mounting it to Vertex AI. We created a Filestore instance and VPC peering connections, and then submitted a job that can directly read from Filestore as a local directory.

Streaming data from NFS shares such as Filestore simplifies and accelerates the process of running training jobs on Vertex AI, empowering users to train better models with more data.

  • To learn more about using NFS file systems with Vertex AI, see NFS support on Vertex AI training.
  • To learn more about Vertex AI, check out this blog post from our developer advocates.

 

 

By: Manqing Feng (Software Engineering Intern) and Nathan Li (Software Engineer)
Source: Google Cloud Blog

