jetbrains / teamcity-azure-storage
TeamCity Azure artifacts storage support plugin
License: Apache License 2.0
When this plugin is enabled, artifacts publish successfully to Azure storage. However, those artifacts are not available to be pulled via the NuGet feed. Am I missing a configuration?
Builds with artifacts larger than a couple of MB fail to upload with the [AzureArtifactsPublisher] plugin. The build log just shows the following generic error message:
Failed to publish artifacts: java.io.IOException
The last build I saw this on produced a single 22 MB artifact.
NuGet Pack with "Publish created packages to build artifacts" checked publishes the NuGet package artifacts to the correct Azure blob storage. NuGet Publish, however, does not. I already have a NuGet package built and cannot use NuGet Pack, so NuGet Publish is the right choice, but it does not seem to work together with Azure Storage.
Is there a way (or could it be added as a new feature) to move existing build artifacts into Azure storage?
It would be nice to be able to consolidate all build artifacts (existing as well as new ones) into the same place.
Alternatively, are there manual steps that can be taken to achieve the same result?
By default, when artifacts are uploaded to Azure Blob Storage, they are stored in a nested path structure:
container/
└── Project
└── Project_Build
└── 100
├── my-app-1521658753171-43bb3c049af9850c75a9.js
└── my-app-latest.js
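The nested layout above corresponds to a blob name of the form `<project>/<buildType>/<buildId>/<artifactPath>`. A minimal sketch of how such a key is assembled (the method and class names here are illustrative assumptions, not the plugin's actual code):

```java
public class BlobPath {
    // Joins the path segments the same way the nested layout above does.
    // Illustrative only; the plugin's real path-building code may differ.
    static String nestedBlobName(String project, String buildType,
                                 long buildId, String artifact) {
        return project + "/" + buildType + "/" + buildId + "/" + artifact;
    }

    public static void main(String[] args) {
        System.out.println(nestedBlobName("Project", "Project_Build", 100,
                "my-app-latest.js"));
        // → Project/Project_Build/100/my-app-latest.js
    }
}
```

A "well-known location" upload, as requested below, would amount to dropping the first three segments and writing the artifact name directly under the container root.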
However, I want to upload blobs to a well-known location such as:
container/
├── my-app-1521658753171-43bb3c049af9850c75a9.js
└── my-app-latest.js
Is this currently possible? It would also require overwriting any existing files with the same path (e.g. my-app-latest.js).
The myLinksCache expires items after urlLifeTime. The Azure URLs are generated using that exact same duration. The default is 60 seconds.
The overall process is:
The time between steps 1 and 3, and between steps 6 and 7, is non-zero. Therefore the cache can return a link that is almost expired or has expired very recently.
My suggestion is that the URL expiry be twice the cache time. A cache time of 60 seconds seems reasonable.
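The suggested relationship can be sketched as follows (a hedged illustration; `CACHE_TTL_SECONDS` and the helper are my own names, not the plugin's identifiers):

```java
import java.time.Duration;
import java.time.Instant;

public class SasLifetime {
    // How long entries live in the link cache (the plugin's default is 60 s).
    static final long CACHE_TTL_SECONDS = 60;

    // A URL must outlive any cache entry that can hand it out,
    // so give it twice the cache TTL, as suggested above.
    static Instant urlExpiry(Instant issuedAt) {
        return issuedAt.plus(Duration.ofSeconds(CACHE_TTL_SECONDS * 2));
    }

    public static void main(String[] args) {
        Instant now = Instant.parse("2020-11-03T23:52:00Z");
        System.out.println(urlExpiry(now)); // → 2020-11-03T23:54:00Z
    }
}
```

With this invariant, even a link fetched from the cache in its final second still has a full cache-TTL of validity left.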
Plugin version: 0.5.0
TeamCity version: 2019.2 (build 71499)
For several months the plugin was working correctly. Yesterday however, the artifact download links stopped working for all builds in all projects.
No changes were made to TeamCity or the blob storage.
The generated redirect download link yields the following error every time:
<Code>AuthenticationFailed</Code> <Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:xxxxx-edited-out-xxxxxxxxxxxxx Time:2020-11-03T23:52:59.2000012Z</Message> <AuthenticationErrorDetail>Signed expiry time [Tue, 03 Nov 2020 23:52:45 GMT] must be after signed start time [Tue, 03 Nov 2020 23:52:59 GMT]</AuthenticationErrorDetail>
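The error indicates the SAS validity window collapsed: the signed expiry fell before the signed start. A common mitigation is to derive both timestamps from a single clock reading and backdate the start to absorb clock skew between the generating server and Azure. A sketch under those assumptions (the constants and names are illustrative, not the plugin's code):

```java
import java.time.Duration;
import java.time.Instant;

public class SasWindow {
    // Tolerance for clock drift between the TeamCity server and Azure.
    static final Duration CLOCK_SKEW = Duration.ofMinutes(5);
    // How long the generated download link should remain valid.
    static final Duration LIFETIME = Duration.ofSeconds(120);

    // Compute [start, expiry] from one clock reading so the window
    // can never collapse, and backdate the start to tolerate skew.
    static Instant[] window(Instant now) {
        return new Instant[]{ now.minus(CLOCK_SKEW), now.plus(LIFETIME) };
    }

    public static void main(String[] args) {
        Instant[] w = window(Instant.parse("2020-11-03T23:52:59Z"));
        System.out.println(w[0] + " .. " + w[1]);
    }
}
```

Because both ends come from the same `now`, the expiry is always `CLOCK_SKEW + LIFETIME` after the start, which would rule out the "expiry before start" failure shown above.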
By using this option it is possible to reuse security credentials from the virtual machine:
https://docs.microsoft.com/en-us/azure/active-directory/msi-tutorial-linux-vm-access-storage
In the AzureConstants of the plugin I found the value storage.azure.url.expiration.time.seconds, which should override the default lifetime of the generated URLs (implemented as 120 seconds). The value of this constant is read via a call to TeamCityProperties.getInteger, from which I conclude it should be defined as a configuration parameter in the build pipeline's project. However, defining this parameter doesn't affect the expiration of the generated URLs in my case. Am I doing something wrong, or is this a bug?
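One likely explanation, assuming standard TeamCity behavior: `TeamCityProperties.getInteger` reads TeamCity *internal* properties on the server, not project configuration parameters, so the override would belong in the server's internal properties file rather than in a build configuration. A hedged config sketch:

```
# <TeamCity Data Directory>/config/internal.properties (server restart may be required)
storage.azure.url.expiration.time.seconds=300
```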
An Azure static website has a container named $web. The plugin always creates a folder named web instead. Even if you specify the path explicitly, it still pushes the artifacts to the web folder instead of $web.
The plugin instantiates the storage account with a constructor that causes HTTP to be used.
The issue is in AzureUtils.kt [1] (not sure if it's the only place):
fun getBlobClient(parameters: Map<String, String>): CloudBlobClient {
    val accountName = parameters[AzureConstants.PARAM_ACCOUNT_NAME]?.trim()
    val accountKey = parameters[AzureConstants.PARAM_ACCOUNT_KEY]?.trim()
    return CloudStorageAccount(StorageCredentialsAccountAndKey(accountName, accountKey)).createCloudBlobClient()
}
I found this after some investigation, having gotten an "Invalid account key" message in the UI when trying to configure an Azure Storage account that is set to accept secure connections only.
[1]
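If the SDK version in use supports it, the likely one-line fix is the two-argument `CloudStorageAccount` constructor, which takes a `useHttps` flag: `CloudStorageAccount(StorageCredentialsAccountAndKey(accountName, accountKey), true)`. The effect on the default endpoint can be illustrated without the SDK (a sketch; the method mirrors the scheme choice, it is not the SDK's code):

```java
public class Endpoint {
    // Mirrors how the default blob endpoint scheme is chosen from the
    // useHttps flag; illustrative only, not the Azure SDK's implementation.
    static String blobEndpoint(String account, boolean useHttps) {
        return (useHttps ? "https" : "http")
                + "://" + account + ".blob.core.windows.net";
    }

    public static void main(String[] args) {
        System.out.println(blobEndpoint("mystore", false));
        // → http://mystore.blob.core.windows.net  (the reported behavior)
        System.out.println(blobEndpoint("mystore", true));
        // → https://mystore.blob.core.windows.net (what a secure-only account requires)
    }
}
```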
It would make it possible to avoid exposing the account key to the build agent machine.
Our artifact is around 1 GB in size and its upload is too slow.
Meanwhile, the Azure Blob Storage client for Java has a special property that enables concurrent requests, ConcurrentRequestCount:
https://docs.microsoft.com/en-us/java/api/com.microsoft.azure.storage.file._file_request_options.setconcurrentrequestcount
I changed the file azure-storage-agent/src/main/kotlin/jetbrains/buildServer/artifacts/azure/publish/AzureArtifactsPublisher.kt in the function AzureArtifactsPublisher:
val blobOptions = BlobRequestOptions()
blobOptions.concurrentRequestCount = 8 // upload blocks with up to 8 parallel requests
FileInputStream(file).use {
    val length = file.length()
    // pass the options through to the blob upload
    blob.upload(it, length, null, blobOptions, null)
    val artifact = ArtifactDataInstance.create(filePath, length)
    publishedArtifacts.add(artifact)
}
Then I built the plugin with Gradle and replaced the zip file on the TeamCity agent machine. The result is very good:
It would be nice to implement the parallel upload properly.
Is it possible to allow inspecting artifacts? It would be a very useful feature.
Even a tree view would be cool.