
Year: 2024

ApiCompat has Moved into the .NET SDK

A while ago I wrote about how to detect breaking changes in .NET using Microsoft.DotNet.ApiCompat. Since then, ApiCompat has moved into the .NET SDK.

What has Changed?

Since ApiCompat is now part of the .NET SDK, the Arcade package feed no longer needs to be referenced.

<PropertyGroup>
  <RestoreAdditionalProjectSources>
    https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-eng/nuget/v3/index.json;
  </RestoreAdditionalProjectSources>
</PropertyGroup>

The package reference can also be removed.

<ItemGroup>
  <PackageReference Include="Microsoft.DotNet.ApiCompat" Version="7.0.0-beta.22115.2" PrivateAssets="All" />
</ItemGroup>

In order to continue running assembly validation from MSBuild, install the Microsoft.DotNet.ApiCompat.Task package.

<ItemGroup>
  <PackageReference Include="Microsoft.DotNet.ApiCompat.Task" Version="8.0.404" PrivateAssets="all" IsImplicitlyDefined="true" />
</ItemGroup>

Enable assembly validation.

<PropertyGroup>
  <ApiCompatValidateAssemblies>true</ApiCompatValidateAssemblies>
</PropertyGroup>

The contract assembly reference directive has changed, so the old directive needs to be replaced.

<ItemGroup>
  <ResolvedMatchingContract Include="LastMajorVersionBinary/lib/$(TargetFramework)/$(AssemblyName).dll" />
</ItemGroup>

<PropertyGroup>
  <ApiCompatContractAssembly>LastMajorVersionBinary/lib/$(TargetFramework)/$(AssemblyName).dll</ApiCompatContractAssembly>
</PropertyGroup>

The property controlling suppression of breaking changes has changed from BaselineAllAPICompatError to ApiCompatGenerateSuppressionFile.

<PropertyGroup>
  <BaselineAllAPICompatError>false</BaselineAllAPICompatError>
  <ApiCompatGenerateSuppressionFile>false</ApiCompatGenerateSuppressionFile>
</PropertyGroup>

That’s it, you’re good to go!

Compatibility Baseline / Suppression directives

Previously the suppression file, ApiCompatBaseline.txt, contained text directives describing suppressed compatibility issues.

Compat issues with assembly Kafka.Protocol:
TypesMustExist : Type 'Kafka.Protocol.ConsumerGroupHeartbeatRequest.Assignor' does not exist in the implementation but it does exist in the contract.

This format has changed to an XML based format, written by default to a file called CompatibilitySuppressions.xml.

<?xml version="1.0" encoding="utf-8"?>
<Suppressions xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Suppression>
    <DiagnosticId>CP0001</DiagnosticId>
    <Target>T:Kafka.Protocol.CreateTopicsRequest.CreatableTopic.CreateableTopicConfig</Target>
    <Left>LastMajorVersionBinary/lib/netstandard2.1/Kafka.Protocol.dll</Left>
    <Right>obj\Debug\netstandard2.1\Kafka.Protocol.dll</Right>
  </Suppression>
</Suppressions>

This format is more verbose than the old one and, if you ask me, a bit harder to read from a human perspective. The descriptions of the various DiagnosticIds can be found in this list.

Path Separator Mismatches

Looking at the suppression example, you might notice that the suppressions contain references to the compared assembly and the baseline contract. It’s not a coincidence that the path separators differ between the reference to the contract assembly and the assembly being compared. The Left reference is a templated copy of the ApiCompatContractAssembly directive using OS-agnostic forward slashes, but the Right reference is generated by ApiCompat and is not OS-agnostic, hence the backslash path separators generated when executing under Windows. If ApiCompat is executed under Linux, it generates forward slash path separators instead.

You might also notice that the reference to the assembly being compared contains the build configuration name, which might not match the configuration used in a build pipeline (Debug vs Release, for example).

Both of these path differences will make ApiCompat ignore the suppressions when they don’t match. There is no documentation on how to consolidate them, but fortunately there are a couple of somewhat hidden transformation directives which can help control how these paths are formatted.

<PropertyGroup>
  <_ApiCompatCaptureGroupPattern>.+%5C$([System.IO.Path]::DirectorySeparatorChar)(.+)%5C$([System.IO.Path]::DirectorySeparatorChar)(.+)</_ApiCompatCaptureGroupPattern>
</PropertyGroup>

<ItemGroup>
  <!-- Make sure the Right suppression directive is OS-agnostic and disregards configuration -->
  <ApiCompatRightAssembliesTransformationPattern Include="$(_ApiCompatCaptureGroupPattern)" ReplacementString="obj/$1/$2" />
</ItemGroup>

The _ApiCompatCaptureGroupPattern regex captures path segment groups, which the ApiCompatRightAssembliesTransformationPattern directive uses to rewrite the assembly reference path into something compatible with both Linux and Windows, removing the build configuration segment in the process.
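To illustrate what the transformation does, here is a small Python sketch of the equivalent regex operation (this is not how MSBuild evaluates it, just a demonstration): on Windows the %5C escape decodes to a backslash that escapes the directory separator in the pattern, and greedy matching drops the configuration segment.

```python
import re

# Equivalent of _ApiCompatCaptureGroupPattern on Windows, where
# $([System.IO.Path]::DirectorySeparatorChar) is '\'.
separator = re.escape("\\")
capture_pattern = f".+{separator}(.+){separator}(.+)"

# The Right reference as ApiCompat generates it on Windows.
right = r"obj\Debug\netstandard2.1\Kafka.Protocol.dll"

# Equivalent of ApiCompatRightAssembliesTransformationPattern with
# ReplacementString="obj/$1/$2": the greedy leading ".+" swallows
# "obj\Debug", leaving the last two path segments as capture groups.
print(re.sub(capture_pattern, r"obj/\1/\2", right))
# obj/netstandard2.1/Kafka.Protocol.dll
```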

Using this will cause the Right directive to change accordingly.

<?xml version="1.0" encoding="utf-8"?>
<Suppressions xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Suppression>
    <DiagnosticId>CP0001</DiagnosticId>
    <Target>T:Kafka.Protocol.CreateTopicsRequest.CreatableTopic.CreateableTopicConfig</Target>
    <Left>LastMajorVersionBinary/lib/netstandard2.1/Kafka.Protocol.dll</Left>
    <Right>obj/netstandard2.1/Kafka.Protocol.dll</Right>
  </Suppression>
</Suppressions>

There is a similar transformation directive for the Left reference, named ApiCompatLeftAssembliesTransformationPattern.


Parsing OpenAPI Style Parameters

Parameters in the OpenAPI 3.1 specification can be defined in two ways using a JSON schema: by using a media type object, which is useful when the parameter is complex to describe, or by using styles, which is more common in simple scenarios, as is often the case with HTTP parameters.

Let’s have a look at the styles defined and where they fit into an HTTP request.

Path Parameters

Path parameters are parameters defined in a path, for example id in the path /user/{id}. Path parameters can be described using label, matrix or simple styles, which are all defined by RFC6570, URI Template.

Here are some examples using the JSON primitive value 1 for a parameter named id:

  • Simple: /user/1
  • Matrix: /user/;id=1
  • Label: /user/.1

It’s also possible to describe arrays. Using the same parameter as above with two JSON primitive values, 1 and 2, it gets serialized as:

  • Simple: /user/1,2
  • Matrix: /user/;id=1,2
  • Label: /user/.1.2

Given a JSON object for a parameter named user with the value, { "id": 1, "name": "foo" }, it becomes:

  • Simple: /user/id,1,name,foo
  • Matrix: /user/;user=id,1,name,foo
  • Label: /user/.id.1.name.foo

The explode modifier can be used to enforce composite values (name/value pairs). For primitive values this has no effect, nor does it have any effect on label and simple style arrays. Using the above examples, here are the equivalents for the styles where explode has an effect:

Arrays

  • Matrix: /user/;id=1;id=2

Objects

  • Simple: /user/id=1,name=foo
  • Matrix: /user/;id=1;name=foo
  • Label: /user/.id=1.name=foo
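The path parameter examples above can be reproduced with a short Python sketch. The function name and structure here are my own for illustration, not part of any specification or library:

```python
def serialize_path(style, name, value, explode=False):
    """Serialize a primitive, list, or dict path parameter value
    following the simple, label, and matrix examples above."""
    # Flatten the value into string tokens: exploded objects become
    # "key=value" pairs, non-exploded objects alternate key, value.
    if isinstance(value, dict):
        if explode:
            tokens = [f"{k}={v}" for k, v in value.items()]
        else:
            tokens = [str(x) for kv in value.items() for x in kv]
    elif isinstance(value, list):
        tokens = [str(v) for v in value]
    else:
        tokens = [str(value)]

    if style == "simple":
        return ",".join(tokens)
    if style == "label":
        return "." + ".".join(tokens)
    if style == "matrix":
        # Exploded arrays repeat the parameter name per item;
        # exploded objects repeat ";key=value" pairs.
        if explode and isinstance(value, list):
            return "".join(f";{name}={t}" for t in tokens)
        if explode and isinstance(value, dict):
            return "".join(f";{t}" for t in tokens)
        return f";{name}=" + ",".join(tokens)
    raise ValueError(f"unknown style: {style}")

print(serialize_path("matrix", "id", [1, 2]))                 # ;id=1,2
print(serialize_path("matrix", "id", [1, 2], explode=True))   # ;id=1;id=2
print(serialize_path("label", "user", {"id": 1, "name": "foo"}))  # .id.1.name.foo
```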

Query Parameters

Query parameters can be described with form, space delimited, pipe delimited or deep object styles. The form style is defined by RFC6570; the rest are defined by OpenAPI.

Primitives

Using the examples from path parameters, a user serialized as a query parameter, ?user={user}, or a user id, ?id={id}, would look like:

  • Form: id=1

Note that the specification’s examples don’t describe any primitive values for the pipe and space delimited styles, even though they are quite similar to the simple style.

Arrays

  • Form: id=1,2
  • SpaceDelimited: id=1%202
  • PipeDelimited: id=1|2

Objects

  • Form: user=id,1,name,foo
  • SpaceDelimited: user=id%201%20name%20foo
  • PipeDelimited: user=id|1|name|foo

Note that the specification’s examples lack the parameter name for arrays and objects; this has been corrected in 3.1.1.

The deepObject style is not applicable without explode, and, like the path styles, explode has no effect on primitive values.

Exploded pipe and space delimited parameters are not described in the examples either, even though they are similar to form. Note though that neither of them would be possible to parse, as the parameter name cannot be inferred.

With all this in mind here are the respective explode examples:

Arrays

  • Form: id=1&id=2
  • SpaceDelimited: id=1%202
  • PipeDelimited: id=1|2

Objects

  • Form: id=1&name=foo
  • SpaceDelimited: id=1%20name=foo
  • PipeDelimited: id=1|name=foo
  • DeepObject: user[id]=1&user[name]=foo
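The form style array examples can be parsed with a small Python sketch. The function is hypothetical, written here only to show the difference between exploded and non-exploded form arrays (URL decoding is left out for brevity):

```python
def parse_form_array(query, name, explode=False):
    """Parse a form-style query parameter into a list of string
    tokens, following the array examples above."""
    pairs = [part.split("=", 1) for part in query.split("&")]
    if explode:
        # Exploded form repeats the parameter name: id=1&id=2 -> ["1", "2"]
        return [v for k, v in pairs if k == name]
    # Non-exploded form packs comma-separated values after a single
    # occurrence of the name: id=1,2 -> ["1", "2"]
    return dict(pairs)[name].split(",")

print(parse_form_array("id=1,2", "id"))                  # ['1', '2']
print(parse_form_array("id=1&id=2", "id", explode=True)) # ['1', '2']
```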

Header Parameters

Header parameters can only be described using the simple style.

Given the headers user: {user} and id: {id}, the respective header parameter values with simple style would look like:

Primitives

  • Simple: 1

Arrays

  • Simple: 1,2

Objects

  • Simple: id,1,name,foo

As with the other parameters described, explode has no effect on primitive values, nor on arrays. For objects it would look like:

  • Simple: id=1,name=foo
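A simple-style object value, such as one carried in a header, can be parsed with a few lines of Python. Again, this is an illustrative sketch, not any library’s API:

```python
def parse_simple_object(value, explode=False):
    """Parse a simple-style object value following the examples above."""
    if explode:
        # "id=1,name=foo" -> {"id": "1", "name": "foo"}
        return dict(pair.split("=", 1) for pair in value.split(","))
    # "id,1,name,foo": alternating keys and values
    tokens = value.split(",")
    return dict(zip(tokens[::2], tokens[1::2]))

print(parse_simple_object("id,1,name,foo"))               # {'id': '1', 'name': 'foo'}
print(parse_simple_object("id=1,name=foo", explode=True)) # {'id': '1', 'name': 'foo'}
```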

Cookie Parameters

A cookie parameter can only be described with the form style, and is represented in a similar way to query parameters. Using the examples Cookie: id={id} and Cookie: user={user}, a cookie parameter value would look like:

Primitives

  • Form: id=1

Arrays

  • Form: id=1,2

Objects

  • Form: user=id,1,name,foo

As with the other parameters described, explode has no effect on primitive values. For arrays and objects it looks like:

Arrays

  • Form: id=1&id=2

Objects

  • Form: id=1&name=foo

Note that exploded objects for cookie parameters have the same problem as query parameters; the parameter name cannot be inferred.

Object Complexity

Theoretically an object can have an endlessly deep property structure, where each property is an object whose properties are also objects, and so on. Neither RFC6570 nor OpenAPI 3.1 defines how deep a structure can be, but it would be difficult to express array items and object properties that are themselves objects in most styles.

As OpenAPI provides media type objects as a complement for complex parameters, it’s advisable to use those in such scenarios.

OpenAPI.ParameterStyleParsers

To support parsing and serialization of style-defined parameters, I’ve created a .NET library, OpenAPI.ParameterStyleParsers. It parses style-serialized parameters into the corresponding JSON instance and vice versa. It supports all examples defined in the OpenAPI 3.1 specification, corrected according to the inconsistencies described earlier. It only supports arrays with primitive item values and objects with primitive property values; for more complex scenarios, use media type objects. The JSON instance type the parameter gets parsed to is determined by the schema’s type keyword. If no type information is defined, it falls back on a best-effort guess based on the parameter style.


ConfigureAwait or Not

I often get into discussions about whether you should disable continuing on the captured context when awaiting a task, so I’m going to write down some of my reasoning around this rather complex functionality. Let’s start with some fundamentals.

async/await

Tasks wrap operations and schedule them using a task scheduler. The default scheduler in .NET schedules tasks on the ThreadPool. async and await are syntactic sugar telling the compiler to generate a state machine that keeps track of the state of the task. The state machine moves forward by tracking the current awaiter’s completion state and the current location of execution. If the awaiter is completed, it continues to the next awaiter; otherwise it asks the current awaiter to schedule the continuation.

If you are interested in deep-diving what actually happens with async and await, I recommend this very detailed article by Stephen Toub.

What is the Synchronization Context?

Different components have different models for how scheduling of operations needs to be synchronized; this is where the synchronization context comes into play. The default SynchronizationContext synchronizes operations on the thread pool, while others might use other means.

The most common awaiters, like those implemented for Task and ValueTask, consider the current SynchronizationContext when scheduling an operation’s continuation. When the state machine moves forward executing an operation, it goes via the current awaiter, which, when not completed, might schedule the state machine’s current continuation via SynchronizationContext.Post.

ConfigureAwait

The only thing this method actually does is wrap the current awaiter together with its single argument, continueOnCapturedContext. This argument tells the awaiter whether to use any configured custom synchronization context or task scheduler when scheduling the continuation. Turned off, i.e. ConfigureAwait(false), it simply bypasses them and schedules on the default scheduler, i.e. the thread pool. If there is no custom synchronization context or scheduler, ConfigureAwait becomes a no-op. The same applies if the awaiter doesn’t need queueing when the state machine reaches the awaitable, i.e. the task has already completed.

Continue on Captured Context or not?

If you know there is a context that any continuation must run on, for example a UI thread, then yes, the continuation must be configured to capture the current context. As this is the default behavior, it’s not technically required to declare it, but explicitly configuring the continuation signals to the next developer that the continuation matters here. If configured implicitly, there won’t be anything hinting that leaving it out wasn’t just a mistake, or that the continuation was ever considered or understood.

In the most common scenarios, though, the current context is not relevant. We can declare that by explicitly stating that the continuation doesn’t need to run on any captured context, i.e. ConfigureAwait(false).

Enforcing ConfigureAwait

Since configuring the continuation is not required, it’s easy to miss. Fortunately there is a Roslyn analyzer that can be enabled to enforce that all awaiters are configured.

Summary

Always declare ConfigureAwait to show that the continuation behavior has been explicitly considered. Only continue on a captured context if there is a good reason to do so; otherwise reap the benefits of executing on the thread pool.
