Asked by:
Upload local file to Azure Data Lake Store using PowerShell

Question
-
Hi All,
We are trying to upload local files to Data Lake Store using PowerShell; please find the script below.
Login-AzureRmAccount

$resourceGroupName = "adlstest"
$resourceGroupLocation = 'east us 2'
$dataLakeStoreName = "datalakestorename"

# Create the resource group if it does not already exist
$resourceGroup = Get-AzureRmResourceGroup -Name $resourceGroupName -ErrorAction SilentlyContinue
if (!$resourceGroup)
{
    Write-Host "Resource group '$resourceGroupName' does not exist, creating a new resource group"
    New-AzureRmResourceGroup -Name $resourceGroupName -Location $resourceGroupLocation
}
else
{
    Write-Host "Using existing resource group '$resourceGroupName'"
}

# Create the Data Lake Store account if it does not already exist
$checkadlsaccount = Test-AzureRmDataLakeStoreAccount -Name $dataLakeStoreName
if ($checkadlsaccount)
{
    Write-Host "$dataLakeStoreName account exists"
}
else
{
    Write-Host "$dataLakeStoreName account does not exist, creating it"
    New-AzureRmDataLakeStoreAccount -ResourceGroupName $resourceGroupName -Name $dataLakeStoreName -Location $resourceGroupLocation
    if ((Test-AzureRmDataLakeStoreAccount -Name $dataLakeStoreName) -eq $True)
    {
        Write-Host "Data Lake Store $dataLakeStoreName created successfully!!!"
    }
    else
    {
        Write-Host "Data Lake Store $dataLakeStoreName failed to create"
    }
}

# Create the target folder and upload the local file
$myrootdir = "/"
New-AzureRmDataLakeStoreItem -Folder -AccountName $dataLakeStoreName -Path $myrootdir/analytics
$a = Get-AzureRmDataLakeStoreChildItem -AccountName $dataLakeStoreName -Path $myrootdir/analytics
Import-AzureRmDataLakeStoreItem -AccountName $dataLakeStoreName -Path "localpath" -Destination $myrootdir\analytics\Book1.xlsx -Force $True

When we execute this script for the first time, it uploads the file successfully. When we run it a second time, we hit a problem; the error we get is as follows.
Import-AzureRmDataLakeStoreItem : Upload operation failed due to the following underlying error:
System.IO.DirectoryNotFoundException: Could not find a part of the path 'C:\Users\LabUser\AppData\Local\Temp\1\FinancialSample.xlsx.transfer.xml'.
   at Microsoft.Azure.Commands.DataLakeStore.Models.DataLakeStoreFileSystemClient.TrackUploadProgress(Task uploadTask, ProgressRecord uploadProgress, Cmdlet commandToUpdateProgressFor, CancellationToken token)
   at Microsoft.Azure.Commands.DataLakeStore.Models.DataLakeStoreFileSystemClient.CopyFile(String destinationPath, String accountName, String sourcePath, CancellationToken cmdletCancellationToken, Int32 threadCount, Boolean overwrite, Boolean resume, Boolean isBinary, Boolean isDownload, Cmdlet cmdletRunningRequest, ProgressRecord parentProgress).
You can try to resume the upload by specifying the "Resume" option. If the error persists, please contact Microsoft support.

I am wondering why the AppData folder is being accessed here.
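For reference, this is roughly how we read the retry suggestion in the error message. It is only a sketch, not something we have verified; it assumes -Resume and -Force are plain switches on Import-AzureRmDataLakeStoreItem in this module version, and "localpath" is still a placeholder for the real local file.

# (a) Resume the interrupted transfer, as the error message suggests
Import-AzureRmDataLakeStoreItem -AccountName $dataLakeStoreName -Path "localpath" -Destination "$myrootdir/analytics/Book1.xlsx" -Resume

# (b) Or start over and overwrite the existing file, using -Force as a bare switch
Import-AzureRmDataLakeStoreItem -AccountName $dataLakeStoreName -Path "localpath" -Destination "$myrootdir/analytics/Book1.xlsx" -Force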
If you have any ideas, please help us.
Thanks.
- Edited by Bodempudi Venkat Thursday, November 30, 2017 3:00 PM
- Moved by Bill_Stewart Friday, January 26, 2018 2:59 PM Off-topic
Thursday, November 30, 2017 2:58 PM
All replies
-
This should be posted in the Azure scripting forum for the best answers.
\_(ツ)_/
Thursday, November 30, 2017 8:25 PM