error Archives - ISbyR (Infrequent Smarts by Reshetnikov) - https://isbyr.com/tag/error/

n8n – The response was filtered due to the prompt triggering Azure OpenAI’s content management policy
https://isbyr.com/n8n-the-response-was-filtered-due-to-the-prompt-triggering-azure-openais-content-management-policy/ – Wed, 28 May 2025

I started playing with n8n.io, specifically with the “My first AI Agent in n8n” workflow that comes OOTB.

I didn’t have an OpenAI subscription, but I do have an Azure subscription and an Azure OpenAI deployment to play with, so I replaced the “standard” OpenAI node with the Azure OpenAI one.

But when I started the execution, the Azure OpenAI Chat Model node threw an exception, straight in my face: “The response was filtered due to the prompt triggering Azure OpenAI’s content management policy.”

The Problem

The summary of the error was not too informative, to be honest.

But if you expand the error details, you can see where the actual problem is:

The thing with Azure OpenAI (or other AI models served by the Azure AI Foundry, for that matter) is that all the requests are going through Azure Guardrails, like Content Filters and Blocklists. And the default content filter decided that the prompt that the n8n Agent node was trying to run was too “fishy”. Look, TBH, I can’t blame it for that, as when you peek under the hood (of the prompt that is sent to the LLM), you can see it is “screaming” at it with commands like ----- IGNORE BELOW -----, which can easily be perceived as a jailbreak attempt.

The Solution

So, what do you do if something default doesn’t work?! You customise it! And Azure AI content filters are no exception; they are very easy to customise:

  • Go to Azure AI Foundry and make sure that you are in the right project, of course.
  • Click Guardrails + Controls on the left side panel.
  • Select the Content filters tab.
  • Click the Create content filter button to start the custom content filter wizard.
    • Provide a name for your content filter on the Basic information page.
    • The Input filter page is the one where we need to make the changes. Find the Prompt shields for jailbreak attacks category and set the action to either Annotate only or Off. (Selecting Annotate only still runs the respective model and returns annotations via the API response, but it will not filter content.)
  • Next, Next to get to the Connection step.
  • Here, select the deployment that you want to apply this content filter to.
  • Hit Next and then Replace in the Replace existing content filter dialogue box.
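With Annotate only, the jailbreak check still runs, but its verdict comes back as an annotation on the chat completion response instead of blocking it. A trimmed response fragment might look roughly like this (shape based on Azure OpenAI’s content-filter annotations; the values are illustrative):

```json
{
  "prompt_filter_results": [
    {
      "prompt_index": 0,
      "content_filter_results": {
        "jailbreak": { "filtered": false, "detected": true }
      }
    }
  ]
}
```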

And that’s it. Next time I executed this step in n8n, it ran successfully.

NOTE: Of course, guardrails in general, and content filters specifically, exist for a very good reason. So you should be very careful when tweaking them or turning them off. You should always consider who will have access to this inference endpoint and what data is accessible to it.

But, since I was playing with it in my personal environment, I didn’t mind making these tweaks to the content filter.


“Create a Custom Skill for Azure AI Search” lab fails
https://isbyr.com/create-a-custom-skill-for-azure-ai-search-lab-fails/ – Sat, 21 Dec 2024

I tried to follow the “Create a Custom Skill for Azure AI Search” lab, but it failed with this error: “The request is invalid. Details: The property ‘includeTypelessEntities’ does not exist on type ‘Microsoft.Skills.Text.V3.EntityRecognitionSkill’. Make sure to only use property names that are defined by the type.”

If you try to follow the “Create a Custom Skill for Azure AI Search” lab that is part of the “Implement knowledge mining with Azure AI Search” course, it fails (at least until Microsoft updates the lab files as per my PR).

There are two issues in the update-skillset.json file that is part of this lab.

Issue and Error #1

The request is invalid. Details: The property 'includeTypelessEntities' does not exist on type 'Microsoft.Skills.Text.V3.EntityRecognitionSkill'. Make sure to only use property names that are defined by the type.

To fix it, remove the line that contains includeTypelessEntities in the JSON file mentioned above.

This is due to the deprecation of this parameter by Microsoft.

Issue and Error #2

If you try to run the update-skillset script again, after fixing the first error, you will be greeted by error #2:

“One or more skills are invalid. Details: Error in skill ‘#1’: Outputs are not supported by skill: entities”

To fix it, in the outputs section a few lines below the line that you’ve just removed, replace

"name": "entities"

with

"name": "namedEntities"

since that is the output name actually exposed by this version of the skill.
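After both fixes, the V3 entity recognition skill in update-skillset.json should look roughly like this (the context, inputs, and targetName below are illustrative; keep whatever the lab file already has):

```json
{
  "@odata.type": "#Microsoft.Skills.Text.V3.EntityRecognitionSkill",
  "context": "/document",
  "inputs": [
    { "name": "text", "source": "/document/content" }
  ],
  "outputs": [
    { "name": "namedEntities", "targetName": "entities" }
  ]
}
```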

That’s it folks, enjoy


Getting ImageAnalysisResultDetails in Azure AI Vision Python SDK
https://isbyr.com/getting-imageanalysisresultdetails-in-azure-ai-vision-python-sdk/ – Fri, 05 Jan 2024


Sometimes when using the Azure AI Vision Python SDK you will not get the expected result, meaning that the reason property of the result returned by the analyze method of the ImageAnalyzer class will not be equal to sdk.ImageAnalysisResultReason.ANALYZED.

Phew, that’s a mouthful; easier to show it in code:

...
image_analyzer = sdk.ImageAnalyzer(cv_client, image, analysis_options)

result = image_analyzer.analyze()
    
if result.reason == sdk.ImageAnalysisResultReason.ANALYZED:
...

The condition in the last line will not be true.

So you would like to see what it actually was:

print(f'ResultReason = {result.reason}')

That will give us the reason.

Well, that’s not too useful, is it?

Let’s get the actual error behind the reason:

result_details = sdk.ImageAnalysisResultDetails.from_result(result)
print(f'Result Details = {result_details.json_result}')

And voila: no free soup, no “Analyze Operation under Computer Vision API” for you.
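The whole check can be wrapped into a small helper. The sketch below is only illustrative: the FakeResultDetails class and the error payload are stand-ins I made up so it runs without the real SDK, but the json_result parsing mirrors what you would do with sdk.ImageAnalysisResultDetails:

```python
import json

# Stand-in for sdk.ImageAnalysisResultDetails: it only mimics the
# json_result attribute, so this sketch runs without the Azure service.
class FakeResultDetails:
    def __init__(self, json_result):
        self.json_result = json_result

def explain_failure(result_details):
    """Pull the service error code and message out of the raw JSON details."""
    error = json.loads(result_details.json_result).get("error", {})
    return f"{error.get('code', '?')}: {error.get('message', '?')}"

# Hypothetical error payload, for illustration only:
details = FakeResultDetails(
    '{"error": {"code": "InvalidRequest", "message": "hypothetical message"}}'
)
print(explain_failure(details))  # InvalidRequest: hypothetical message
```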


Azure: Invalid user storage id or storage type is not supported
https://isbyr.com/azure-invalid-user-storage-id-or-storage-type-is-not-supported/ – Wed, 27 Dec 2023

I was trying to update my Azure Language service to enable Custom text classification / Custom Named Entity Recognition. That feature requires a storage account. While you are supposed to be able to create the storage account when you enable the feature, it didn’t work for me 🙁 (I was getting an “Invalid user storage id or storage type is not supported” error).

Problem 1: “Invalid user storage id or storage type is not supported”

As part of learning a bit about Azure AI Services, I was doing a Classify Text exercise, but since it follows a few prior exercises, I was not creating a new Language service from scratch but rather re-using an existing one. So I needed to enable the Custom text classification… feature.

Azure Language Service Features page

I would click the Create a new storage account link, fill in all the details, and click Apply on the Azure Language service Features page.

But it would almost immediately error out with an “Invalid user storage id or storage type is not supported” message.

Solution for: “Invalid user storage id or storage type is not supported”

The solution was simple: create a new Azure Storage Account and then select it from the drop-down list (or use one of the existing ones).

Problem 2: blob containers are not visible when creating a new project

Next, when I was trying to create the Custom text single label classification project, I was supposed to select a container (from an already pre-filled storage account), but no containers were visible.

Solution for: blob containers are not visible when creating a new project

Make sure that the managed identity has the necessary permissions.

In the Azure Portal, go to your storage account > Access control (IAM).

Click Add > Add role assignment.

In the Role section, search for and select Storage Blob Data Owner.

Under the Members section, select Managed identity and click Select members.

Then, on the right, select the correct subscription, Language as the managed identity type, and the correct resource.

Select, apply, etc. until the role is assigned, and you will be able to pick the blob container in the Azure Language services wizard that is used to create a new project.
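The same role assignment can also be scripted. A hedged Azure CLI sketch (all resource names are placeholders, and I’m assuming the Language resource uses its system-assigned managed identity):

```shell
# Look up the Language resource's system-assigned managed identity
principalId=$(az cognitiveservices account show \
  --name <LANGUAGE_RESOURCE_NAME> --resource-group <RESOURCE_GROUP> \
  --query identity.principalId -o tsv)

# Grant it Storage Blob Data Owner on the storage account
az role assignment create \
  --assignee-object-id "$principalId" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Blob Data Owner" \
  --scope "$(az storage account show \
    --name <STORAGE_ACCOUNT_NAME> --resource-group <RESOURCE_GROUP> \
    --query id -o tsv)"
```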

Problem 3: “A server error occurred. Please refresh the page and try again”

After going through the wizard, it looks like the project is finally created, but when you click it, the following (very informative 😕) error pops up on the right.

Solution for: “A server error occurred. Please refresh the page and try again”

Add CORS for Language service endpoint to the Azure Storage account.

In the Azure Portal, go to your storage account > Resource sharing (CORS).

Fill in:

  • Allowed origins: https://language.cognitive.azure.com
  • Allowed methods: DELETE, GET, PUT
  • Allowed headers: *
  • Max age: 500

Click Save
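If you prefer the CLI, the same CORS rule can be added with az storage cors add (the account name is a placeholder; this is a sketch of the portal settings above, not something the wizard requires):

```shell
az storage cors add \
  --services b \
  --methods DELETE GET PUT \
  --origins https://language.cognitive.azure.com \
  --allowed-headers "*" \
  --max-age 500 \
  --account-name <STORAGE_ACCOUNT_NAME>
```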

You might need to recreate the Custom Text Classification project either via the UI or using a REST call like the one below:

curl -X PATCH \
  "https://<YOUR_LANGUAGE_SERVICE_URL_PREFIX>.cognitiveservices.azure.com/language/authoring/analyze-text/projects/<PROJECT_NAME>?api-version=2022-05-01" \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -H 'Ocp-Apim-Subscription-Key: <YOUR_LANGUAGE_SERVICE_API_KEY>' \
  -d '{
    "projectName": "<PROJECT_NAME>",
    "language": "en-us",
    "projectKind": "CustomSingleLabelClassification",
    "description": "<PROJECT_DESCRIPTION>",
    "multilingual": false,
    "storageInputContainerName": "<BLOB_CONTAINER_NAME>"
  }'

P.S.

I probably could have avoided all these problems if I had just created a new Azure Language service from scratch, following the training, instead of re-using an existing one… but then what would I be writing here instead? 🤔


AWS VolumeModificationSizeLimitExceeded
https://isbyr.com/aws-volumemodificationsizelimitexceeded/ – Mon, 09 Dec 2019

If you are dealing with large numbers of EBS volumes and need to extend them, you might one day face an AWS VolumeModificationSizeLimitExceeded error.

We occasionally extend our EBS volumes. It is done by updating the CloudFormation for these instances; then a script (written by people much more AWS-knowledgeable than me) checks the CloudFormation parameters, and if it sees that a volume size has increased, it does all the AWS and Linux “black magic” to actually extend the volumes and make the OS aware of it.

One happy day we were extending volumes for our 60-instance EC2 fleet from 2000 GB to 3000 GB. So I updated the CloudFormation and the extension kicked off. On 51 out of 60 instances it completed successfully, but on the rest I could see the cfn script complaining: “An error occurred (VolumeModificationSizeLimitExceeded) when calling the ModifyVolume operation: You have exceeded your maximum gp2 volume modification storage limit of 100 TiB in this region. Please contact AWS Support to request an Elastic Block Store volume modification storage limit increase or retry once existing volume modifications have completed”.

Apparently, there is a limit on the “original” total size of volumes that can be modified in one go; the default limit is 100 TiB, which we hit: 52 × 2000 GiB ≈ 101.56 TiB.
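A quick sanity check of that arithmetic; a minimal sketch, assuming the limit counts the original size of all volumes being modified concurrently:

```python
GIB_PER_TIB = 1024  # the AWS limit is expressed in TiB, volume sizes in GiB

def modifying_storage_tib(volume_count, original_size_gib):
    """Total 'modifying storage' a bulk resize counts against the limit."""
    return volume_count * original_size_gib / GIB_PER_TIB

# 51 in-flight modifications stay under the 100 TiB default limit...
print(round(modifying_storage_tib(51, 2000), 2))  # 99.61
# ...but the 52nd request tips it over:
print(round(modifying_storage_tib(52, 2000), 2))  # 101.56
```

This matches what we saw: the first 51 resizes went through, and the remaining requests were rejected.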

It is mentioned in https://docs.aws.amazon.com/general/latest/gr/aws_service_limits.html as “Maximum modifying storage”

(As of December 2019) I couldn’t find where one can see the current limit, and the only way to increase it is by logging a support case with AWS. By the way, you will need to log a “generic” support case and not a “limit increase” case, as this limit is not available for selection.

Also, if you plan to extend more than 300 TB in one go, you had better contact AWS as soon as possible; here is what I got from AWS Support when trying to increase the value to 400 TB:

“With regards to your question, the 300TB is the value we can provide at this point because request with higher value requires further review/approval from our EBS team. The outcome depends on their review and we don’t guarantee whether the request will get approved or not.”

P.S. The “you’ve reached the maximum modification rate per volume limit. wait at least 6 hours between modifications per ebs volume.” error is different: it happens when you try to increase an EBS volume again within less than 6 hours of the previous modification of the same volume.

Update October 2022

These limits are now better documented at https://docs.aws.amazon.com/general/latest/gr/ebs-service.html, and it seems that one can now request an increase using a dedicated “limit increase” case.

