filesystems - How to connect to my Windows network share path via Azure Databricks

I have files in a local Windows network file share path. I can access the files via Azure ADF using a self-hosted IR, but we need to load those files via Databricks.

I have tried the code below:

spark.read.csv('file:///networkpath/folder/', header="true", inferSchema="true")

I also tried loading the file manually via the UI upload, and that works fine.

But I need to know how to automate this file upload to the DBFS file system.

Question from: https://stackoverflow.com/questions/65924739/how-to-connect-my-window-network-share-path-via-azure-data-bricks

1 Reply


Unfortunately, Azure Databricks doesn't support connecting to a Windows network share.

Note: It is highly recommended not to store any production data in the default DBFS folders.
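As a hedged illustration of that note, a file that does land in the default DBFS area (for example under /FileStore) can be moved to externally backed storage with dbutils.fs.mv in a notebook; the mount point and file names below are assumptions for illustration only:

# Minimal sketch (run in a Databricks notebook): move an uploaded file out of
# the default DBFS folder into a pre-configured ADLS/Blob mount.
# Both paths are hypothetical.
dbutils.fs.mv(
    "dbfs:/FileStore/tables/sample.csv",   # default UI upload location
    "dbfs:/mnt/datalake/raw/sample.csv"    # externally backed mount point
)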


There are multiple ways to upload files from a local machine to the Azure Databricks DBFS folder.

Method 1: Using the Azure Databricks portal

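Once a file has been uploaded through the portal (by default it lands under /FileStore/tables), it can be read back in a notebook with Spark. A minimal sketch, assuming a hypothetical file name uploaded_file.csv:

# Read a CSV uploaded via the portal UI; /FileStore/tables is the default
# upload target, and the file name here is a hypothetical example.
df = spark.read.csv(
    "dbfs:/FileStore/tables/uploaded_file.csv",
    header=True,
    inferSchema=True
)
df.show(5)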

Method 2: Using the Databricks CLI

The DBFS command-line interface (CLI) uses the DBFS API to expose an easy-to-use command-line interface to DBFS. Using this client, you can interact with DBFS using commands similar to those you use on a Unix command line. For example:

# List files in DBFS
dbfs ls
# Put local file ./apple.txt to dbfs:/apple.txt
dbfs cp ./apple.txt dbfs:/apple.txt
# Get dbfs:/apple.txt and save to local file ./apple.txt
dbfs cp dbfs:/apple.txt ./apple.txt
# Recursively put local dir ./banana to dbfs:/banana
dbfs cp -r ./banana dbfs:/banana
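
Since the original question is about automating the upload, the dbfs cp command can also be wrapped in a small script and scheduled on a machine that can reach the Windows share. A minimal Python sketch, assuming the Databricks CLI is installed and configured (databricks configure --token) and that the share is reachable as a local or UNC path; the paths are hypothetical:

import subprocess

# Hypothetical source folder on the Windows network share (mapped or UNC path)
SOURCE_DIR = r"\\fileserver\share\folder"
# Hypothetical DBFS target folder
TARGET_DIR = "dbfs:/FileStore/tables/network_share"

# Recursively copy the folder to DBFS using the configured Databricks CLI.
# --overwrite replaces files that already exist at the target.
subprocess.run(
    ["dbfs", "cp", "-r", "--overwrite", SOURCE_DIR, TARGET_DIR],
    check=True,
)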


Method 3: Using a third-party tool, DBFS Explorer

DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This will work with both AWS and Azure instances of Databricks. You will need to create a bearer token in the web interface in order to connect.

Step 1: Download and install DBFS Explorer.

Step 2: Open DBFS Explorer and enter the Databricks URL and a personal access token.


Step 3: Select the target folder in DBFS, then drag and drop the files from the local machine into that folder and click Upload.


