
jetbrains / teamcity-azure-storage


TeamCity Azure artifacts storage support plugin

License: Apache License 2.0

Kotlin 88.61% Java 11.32% CSS 0.07%
azure azure-storage teamcity artifacts teamcity-plugin

teamcity-azure-storage's Introduction

TeamCity Azure Storage


This plugin allows replacing the TeamCity built-in artifacts storage with Azure Blob Storage. The artifacts storage can be changed at the project level. After the storage is changed, new artifacts produced by builds of this project will be published to the configured Azure Blob Storage account. Besides publishing, the plugin also implements resolution of artifact dependencies and clean-up of build artifacts.

Features

When installed and configured, the plugin:

  • allows uploading artifacts to Azure Blob Storage
  • allows downloading artifacts from Azure Blob Storage
  • handles resolution of artifact dependencies
  • handles clean-up of artifacts
  • displays artifacts located in Azure Blob Storage in the TeamCity UI.

Download

You can download the plugin and install it as an additional TeamCity plugin.

Compatibility

The plugin is compatible with TeamCity 2017.1.1 and later.

Configuring

The plugin adds the Artifacts Storage tab to the Project Settings page in the TeamCity web UI. The tab lists the built-in TeamCity artifacts storage, which is displayed by default and marked as active.

To configure Azure Blob Storage for TeamCity artifacts, perform the following:

  1. Select Azure Storage as the storage type.
  2. Fill in the account name and key.
  3. Save your settings.

The configured Azure storage will appear on the Artifacts Storage page. Make it active using the corresponding link. After that, the artifacts of this project, its subprojects, and its build configurations will be stored in the configured storage.

Build

This project uses Gradle as the build system. You can easily open it in IntelliJ IDEA or Eclipse. To test and build the plugin, execute the Gradle build command.

Contributions

We appreciate all kinds of feedback, so please feel free to send a PR or write an issue.

teamcity-azure-storage's People

Contributors

burnasheva, droyad, dtretyakov, ilyafomenko, julia-alexandrova, orybak, pavelsher, wayfarer-rus


teamcity-azure-storage's Issues

NuGet Publish still leaves artifacts in internal storage, not in azure blob storage.

NuGet Pack with "Publish created packages to build artifacts" checked publishes the NuGet package artifacts to the correct Azure Blob Storage. NuGet Publish, however, does not. I already have a NuGet package built and cannot use NuGet Pack, so NuGet Publish is the right choice, but it does not seem to work together with Azure Storage.

Setting URL expiration time not working

In the plugin's AzureConstants I found the value storage.azure.url.expiration.time.seconds, which should override the default lifetime of the generated URLs (implemented as 120 seconds). The value of this constant is read via a call to TeamCityProperties.getInteger, from which I concluded it should be defined as a configuration parameter in the build pipeline's project.

However, defining this parameter doesn't affect the expiration of the generated URLs in my case. Am I doing something wrong, or is this a bug?
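For reference, a minimal sketch (the object name and the default value used here are assumptions) of how such a value is typically read: TeamCityProperties.getInteger looks up a TeamCity server internal property (set in teamcity.internal.properties or as a -D JVM option on the server), not a project or build configuration parameter, which may explain why setting it as a parameter has no effect.

    import jetbrains.buildServer.serverSide.TeamCityProperties

    // Hypothetical sketch: reads the URL lifetime from a server internal property,
    // falling back to the 120-second default mentioned in the issue.
    object AzureUrlLifetime {
        private const val URL_EXPIRATION_PROPERTY = "storage.azure.url.expiration.time.seconds"
        private const val DEFAULT_LIFETIME_SECONDS = 120

        fun urlLifetimeSeconds(): Int =
            TeamCityProperties.getInteger(URL_EXPIRATION_PROPERTY, DEFAULT_LIFETIME_SECONDS)
    }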

Communication with Azure is over INSECURE HTTP

The plugin instantiates a storage account with a constructor that causes HTTP to be used.

The issue is in AzureUtils.kt [1] (not sure if it's the only place)

    fun getBlobClient(parameters: Map<String, String>): CloudBlobClient {
        val accountName = parameters[AzureConstants.PARAM_ACCOUNT_NAME]?.trim()
        val accountKey = parameters[AzureConstants.PARAM_ACCOUNT_KEY]?.trim()
        return CloudStorageAccount(StorageCredentialsAccountAndKey(accountName, accountKey)).createCloudBlobClient()
    }

I found this through some investigation after getting an "Invalid account key" message in the UI when trying to configure an Azure Storage account that accepts secure connections only.

[1]

return CloudStorageAccount(StorageCredentialsAccountAndKey(accountName, accountKey)).createCloudBlobClient()
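A possible fix, sketched against the legacy com.microsoft.azure.storage SDK used by the snippet above (AzureConstants is the plugin's own constants class, as in the original code): CloudStorageAccount has a constructor overload with a useHttps flag, and passing true makes the generated endpoints HTTPS.

    import com.microsoft.azure.storage.CloudStorageAccount
    import com.microsoft.azure.storage.StorageCredentialsAccountAndKey
    import com.microsoft.azure.storage.blob.CloudBlobClient

    // Sketch of the suggested change: the (credentials, useHttps) overload forces
    // https:// endpoints instead of the http:// default reported in this issue.
    fun getBlobClient(parameters: Map<String, String>): CloudBlobClient {
        val accountName = parameters[AzureConstants.PARAM_ACCOUNT_NAME]?.trim()
        val accountKey = parameters[AzureConstants.PARAM_ACCOUNT_KEY]?.trim()
        val credentials = StorageCredentialsAccountAndKey(accountName, accountKey)
        return CloudStorageAccount(credentials, true /* useHttps */).createCloudBlobClient()
    }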

The link cache hangs on to the SAS urls too long

The myLinksCache expires items after urlLifeTime. The Azure URLs are generated using that exact same duration. The default is 60 seconds.

The overall process is:

  1. Calculate the link expiry date
  2. Request a URL for that expiry date
  3. Add the URL to the cache
  4. [Just under 60 seconds passes]
  5. Another build needs the link
  6. The link is in the cache, that link is returned
  7. The link is used to download the file

The time between steps 1 and 3, and between steps 6 and 7, is non-zero. Therefore the cache can return a link that is almost expired or has expired very recently.

My suggestion is that the URL expiry be twice the cache time. A cache time of 60 seconds seems reasonable.
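A rough sketch of that suggestion (class and parameter names here are hypothetical, not the plugin's actual myLinksCache): the SAS URL is generated with twice the cache TTL, so a link served from the cache always has at least the cache TTL of validity remaining.

    import java.util.concurrent.ConcurrentHashMap

    // Hypothetical sketch: cache entries live for 60 seconds, while the SAS URL is
    // generated with a 120-second lifetime, so a cached link is never handed out
    // with only a few seconds (or less) of validity left.
    class SasLinkCache(
        private val cacheTtlMillis: Long = 60_000L,
        private val urlLifetimeMillis: Long = 2 * cacheTtlMillis
    ) {
        private data class Entry(val url: String, val cachedAt: Long)
        private val cache = ConcurrentHashMap<String, Entry>()

        fun getLink(artifactPath: String, generateSasUrl: (lifetimeMillis: Long) -> String): String {
            val now = System.currentTimeMillis()
            cache[artifactPath]?.let { entry ->
                if (now - entry.cachedAt < cacheTtlMillis) return entry.url
            }
            val url = generateSasUrl(urlLifetimeMillis)
            cache[artifactPath] = Entry(url, now)
            return url
        }
    }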

Download links stopped working for artifacts hosted on Azure BLOB storage

Plugin version: 0.5.0
TeamCity version: 2019.2 (build 71499)
For several months the plugin was working correctly. Yesterday, however, the artifact download links stopped working for all builds in all projects.
No changes were made to TeamCity or the blob storage.
The generated redirect download link yields the following error every time:
    <Code>AuthenticationFailed</Code>
    <Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:xxxxx-edited-out-xxxxxxxxxxxxx Time:2020-11-03T23:52:59.2000012Z</Message>
    <AuthenticationErrorDetail>Signed expiry time [Tue, 03 Nov 2020 23:52:45 GMT] must be after signed start time [Tue, 03 Nov 2020 23:52:59 GMT]</AuthenticationErrorDetail>
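The error indicates the SAS token's signed start time is later than its expiry, which points at a timing or clock issue on the side generating the token. One possible mitigation (a sketch only, using the legacy com.microsoft.azure.storage SDK and a hypothetical helper name, not the plugin's actual code) is to backdate the SAS start time so that moderate clock skew cannot produce an invalid token:

    import com.microsoft.azure.storage.blob.CloudBlockBlob
    import com.microsoft.azure.storage.blob.SharedAccessBlobPermissions
    import com.microsoft.azure.storage.blob.SharedAccessBlobPolicy
    import java.util.Date
    import java.util.EnumSet

    // Hypothetical sketch: backdate the start time by a few minutes and keep the expiry
    // relative to "now", so the expiry can never precede the start even with clock skew.
    fun createDownloadSas(blob: CloudBlockBlob, lifetimeSeconds: Int): String {
        val now = System.currentTimeMillis()
        val policy = SharedAccessBlobPolicy().apply {
            permissions = EnumSet.of(SharedAccessBlobPermissions.READ)
            sharedAccessStartTime = Date(now - 5 * 60_000)           // 5 minutes in the past
            sharedAccessExpiryTime = Date(now + lifetimeSeconds * 1000L)
        }
        return "${blob.uri}?${blob.generateSharedAccessSignature(policy, null)}"
    }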

Artifacts > a couple MB in size fail to upload

Builds with artifacts larger than a couple of MB fail to upload with the [AzureArtifactsPublisher] plugin. The build log just shows the following generic error message:

Failed to publish artifacts: java.io.IOException

The last build I saw this on produced a single 22 MB artifact.

Artifacts are not available via Nuget feed

When this plugin is enabled, the artifacts publish fine to Azure storage. However, those artifacts are not available to be pulled via the NuGet feed. Am I missing a configuration?

Add Azure Storage Blob Upload build step

By default, when artifacts are uploaded to Azure Blob Storage, they are placed in a nested path structure:

container/
└── Project
    └── Project_Build
        └── 100
            ├── my-app-1521658753171-43bb3c049af9850c75a9.js
            └── my-app-latest.js

However, I want to upload blobs to a well-known location such as:

container/
├──  my-app-1521658753171-43bb3c049af9850c75a9.js
└──  my-app-latest.js

Is this currently possible? It would also require overwriting any existing files with the same path (e.g. my-app-latest.js).
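This doesn't appear to be supported out of the box. As a rough illustration of what such a build step could do (a sketch with hypothetical names, again using the legacy com.microsoft.azure.storage SDK), the blob name would simply be the file name, and re-uploading the same name overwrites the existing blob:

    import com.microsoft.azure.storage.CloudStorageAccount
    import java.io.File
    import java.io.FileInputStream

    // Hypothetical sketch of a "flat" upload: no Project/BuildConfiguration/BuildNumber
    // prefix, and an existing blob with the same name (e.g. my-app-latest.js) is replaced.
    fun uploadToWellKnownLocation(connectionString: String, containerName: String, file: File) {
        val container = CloudStorageAccount.parse(connectionString)
            .createCloudBlobClient()
            .getContainerReference(containerName)
        container.createIfNotExists()

        val blob = container.getBlockBlobReference(file.name)
        FileInputStream(file).use { stream ->
            blob.upload(stream, file.length())   // block blob upload replaces any existing content
        }
    }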

Moving existing build artifacts

Is there a way (or could it be added as a new feature) to move existing build artifacts into Azure storage?

It would be nice to be able to consolidate all build artifacts (existing as well as new ones) into the same place.

Alternatively, are there manual steps that can be taken to achieve the same result?

Parallel upload to Azure Blob Storage

Our artifact is around 1 GB in size and its upload is too slow.

At the same time, the Azure Blob Storage client for Java has a special property that enables concurrent requests, ConcurrentRequestCount: https://docs.microsoft.com/en-us/java/api/com.microsoft.azure.storage.file._file_request_options.setconcurrentrequestcount

I changed the file azure-storage-agent/src/main/kotlin/jetbrains/buildServer/artifacts/azure/publish/AzureArtifactsPublisher.kt in the AzureArtifactsPublisher function:

    val blobOptions = BlobRequestOptions()
    blobOptions.concurrentRequestCount = 8
    FileInputStream(file).use {
        val length = file.length()
        blob.upload(it, length, null, blobOptions, null)
        val artifact = ArtifactDataInstance.create(filePath, length)
        publishedArtifacts.add(artifact)
    }

Then I built the plugin with Gradle and replaced the zip file on the TeamCity agent machine. The result was a significant improvement.
It would be nice to implement parallel upload in the plugin.

Cannot inspect artifacts

Is it possible to allow inspection of artifacts? It is a very useful feature.
Even a tree view would be cool.
