Power Apps: Canvas Apps Not Bringing New Records


Hey there,

If you are facing a problem similar to the scenario below, then the following proposed solution is for you.

Scenario

  1. A user is using your app on a mobile phone or any other device.
  2. Meanwhile, someone creates a new record in your data source from a Model-Driven App (or any other place).
  3. Your Canvas App is not bringing that new record to the user.

Solutions

  1. If your Gallery’s data source is a collection and you are populating that collection somewhere, then your expressions should be in the following sequence.

    Correct
    Refresh(<data-source>);
    Collect(<collection>, <data-source>);

    Incorrect
    Collect(<collection>, <data-source>);

  2. Otherwise, you need to add a button/icon and write the following formula on the button’s “OnSelect” property to refresh your data source.

    Expression
    Refresh(<data-source>);
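Putting the two solutions together, here is a minimal Power Fx sketch, assuming a hypothetical data source named Products cached in a collection colProducts (ClearCollect combines Clear and Collect into one call):

```powerfx
// On the refresh button's OnSelect:
Refresh(Products);                    // pull the latest records from the server
ClearCollect(colProducts, Products);  // rebuild the cached collection from the refreshed data
```

With this pattern, a Gallery bound to colProducts picks up records created elsewhere as soon as the user taps the refresh button.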

Problem is solved!


Stay tuned for more solutions, and keep up the good work.

Thank you!

Can’t Select Records in Polymorphic Lookup? Here’s the Fix!

Polymorphic lookups (like Regarding in Dynamics 365) are great—until you can’t pick records from your custom table or even system tables like Users. Frustrating, right? Here’s a quick checklist to get you back on track:

1. Check Custom JavaScript 🚫

Custom JS might be blocking your lookup. Disable any scripts on the form and test again.

2. Confirm Table is in Your Model-Driven App ✅

Open your app in the Modern App Designer (with Admin access). Is your missing table listed under Tables? If not, add it!


3. Use Classic Designer if Needed 🕰️

If the modern designer won’t let you add that table, switch to the Classic App Designer and add it there.


Quick Recap

  • Polymorphic lookup dropdown depends on your app’s table list.
  • Custom JS can interfere—check first.
  • Missing tables? Add them using the app designer.

Fixing Solution Import Errors After Enabling “Create a New Activity” | Dynamics 365 / Dataverse

When you enable “Create a New Activity” for a custom Dataverse table, the platform automatically creates several activity-related relationships (Task, Email, ActivityPointer, etc.). This works fine in your current environment, but when you export the solution and import it into another environment, you may run into errors like:

“The entity relationship role of the referencing entity is required…”
“The referencing entity must have the CanBeRelatedEntityInRelationship property set to true.”

These errors appear because the solution is missing important metadata connected to the activity relationships that Dataverse generated automatically.

👉 Root Cause

Enabling “Create a New Activity” causes Dataverse to create new 1:N system relationships between your custom table and several Activity tables, such as:

  • ActivityPointer (Activity)
  • Task
  • Email
  • Phone Call
  • Appointment
  • Any other custom activity entity

When exporting your solution, the relationship metadata might be included, but the Activity tables themselves are not — causing the import to fail since the target environment cannot recreate these relationships without the updated metadata.

✅ Fix

To successfully import the solution into another environment, include all components involved in the new activity relationships.

Add these to your solution:

  1. Your custom table metadata (Mandatory)
    • Ensures the “Create a New Activity” setting is included.
  2. All activity tables involved
    • Activity (ActivityPointer)
    • Task, Email, PhoneCall, Appointment (Optional)
    • Any other activity entity Dataverse linked automatically (Optional)
  3. Auto-generated 1:N activity relationships
    • Example:
      • yourTable_ActivityPointers
      • yourTable_Tasks
      • yourTable_Emails

Once all these pieces are included, the solution imports cleanly across environments.

💡Summary

If you enable “Create a New Activity” on a custom table, Dataverse creates several system-managed activity relationships. These relationships depend on the Activity and Task tables, so you must include:

✔ Your custom table metadata
✔ ActivityPointer + specific activity tables
✔ Auto-generated relationships

Adding these components ensures smooth solution imports and prevents frustrating metadata errors.

Azure APIM configuration for D365 F&O Custom Services

Hey there,

In this blog, I’ll demonstrate the steps required to configure an Azure API Management (APIM) instance for D365 F&O custom services, to enable monitoring and analytics for service calls and consumption. I’m assuming that you have already created and deployed your “Service” and “Service Group” in your D365 F&O application.

You can validate that your custom service is up and running by executing the following URL in your browser; it should return the metadata of your service:
“https://[D365FOURL]/api/services/[YourServiceGroup]/[YourService]”
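For context, the service behind such a URL is just an X++ class exposed through a Service object inside a Service Group. A minimal sketch (all names here are hypothetical; your actual contract and logic will differ):

```xpp
// Hypothetical custom service class; expose it via a Service object
// ([YourService]) inside a Service Group ([YourServiceGroup]) in the AOT.
class xyzTestService
{
    // A simple operation that APIM will later call as GET /getTestService
    public str getTestService()
    {
        return 'Custom service is up and running';
    }
}
```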

APIM Service Configuration Steps

1. App Registration

First, we need to create a new App Registration in Azure for our D365 F&O application, with a client secret. Follow the steps below:

  • Log in to your Azure Portal with your credentials.
  • Navigate to “App registrations”.
  • Click on “+ New registration” to create a new App registration.
  • Fill out the following values as per your requirement and hit the “Register” button.
  • Name: Give a name to your App registration (i.e. APIM-App-Registration)
  • Supported account types: Select the account types (i.e. Accounts in this organizational directory only)
  • Redirect URL: Leave it blank for now (we’ll configure it later)

1.1. API Permission

  • Go to “API Permissions” and add the following delegated permissions.

1.2 Add Certificates & secrets

  • Go to “Certificates & secrets”.
  • Click “+ New client secret” to add a new client secret.
  • Fill out the values and hit the “Add” button.
  • Note down the generated client secret value somewhere; we’ll need it later.

2. API Management Service

Now we are going to create API Management Service instance to call APIs with monitoring and analytics capabilities.

  • Navigate to “API Management services”.
  • Click on “+ New” to create a new APIM service instance.
  • Fill out the following values.
  • Subscription: Select your Azure subscription
  • Resource group: Select a resource group (you can create one as well)
  • Region: Select your region (default East US)
  • Resource name: Enter a logical name for your APIM resource (i.e. APIM-Resource)
  • Organization name: Enter your organization name (i.e. Microsoft)
  • Administrator email: Enter the administrator email (i.e. AdministratorEmail@Microsoft.com)
  • Pricing tier: Select a pricing tier (Developer (no SLA) by default)
  • Click “Review + create” to review the APIM settings.
  • Click “Create” to create the API Management Service (be patient; APIM creation takes around 10–25 minutes).

2.1. Add Product

Once the APIM instance is created, we need to add a product to it; the product’s subscription keys are used when sending requests to our API. Follow the steps below to add a product in your Azure APIM instance.

  • Open your APIM instance, go to “Products” section.
  • Click “+Add” button to add a new Product.
  • Fill out the following values.
  • Display name: Give a name to your product (i.e. D365FinsOps)
  • Id: Auto-generated id based on your product display name (i.e. d365finsops)
  • Description: Give a description for your product (i.e. Dynamics 365 Finance and Operations)
  • Published: Yes
  • Requires subscription: Yes (this creates subscription keys for your product, to be used later)
  • Requires approval: No
  • Subscription count limit: Leave blank
  • Legal terms: Define your legal terms

2.2. Add OAuth 2.0 Service

Now we need to create an “OAuth 2.0” connection for our registered app, which we’ll use later. Follow the steps below to add an OAuth 2.0 service connection in your Azure APIM instance.

  • In your APIM instance, go to “OAuth 2.0 + OpenID Connect” section.
  • Click “+Add” button
  • Fill out the following values and hit “Create” button.
  • Display name: Give a name to your OAuth 2.0 connection (i.e. APIM-OAuth2-Connection)
  • Id: Auto-generated id based on your display name (i.e. apim-oauth2-connection)
  • Description: Give a description for your connection (i.e. APIM OAuth2 service connection)
  • Client registration page URL: Navigate to the App Registration that we created at “Step 1”, go to “Overview > Endpoints”, copy the “OAuth 2.0 authorization endpoint (v2)” value and paste it here (i.e. https://login.microsoftonline.com/organizations/oauth2/v2.0/authorize)
  • Authorization grant types: Select the authorization grant types (i.e. Authorization code, Client credentials)
  • Authorization endpoint URL: Same as above; paste the “OAuth 2.0 authorization endpoint (v2)” value here (i.e. https://login.microsoftonline.com/organizations/oauth2/v2.0/authorize)
  • Authorization request method: Select the request method (i.e. GET)
  • Token endpoint URL: From “Overview > Endpoints”, copy the “OAuth 2.0 token endpoint (v2)” value and paste it here (i.e. https://login.microsoftonline.com/organizations/oauth2/v2.0/token)
  • Client authentication methods: Select the authentication method (i.e. In the body)
  • Default scope: Paste your D365 Fin & Ops application URL with the suffix “/.default” (i.e. [D365FOURL]/.default)
  • Client credentials (Client ID): From your App Registration’s “Overview”, copy the “Application (client) ID” value and paste it here (i.e. xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)
  • Client credentials (Client secret): Paste the client secret value that we created at “Step 1.2” (i.e. xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx)

2.3. Add API

Now, this is the main part where we define our custom service and its operations to be called as an API.

  • In your APIM instance, go to “API” tab.
  • Select “+Add API”
  • Click “Blank API”
  • Fill out the following values and hit “Create” button.
  • Display name: Give a name to your API (i.e. Custom Service)
  • Name: Auto-generated name based on your display name (i.e. custom-service)
  • Web service URL: Paste your D365 Fin & Ops application URL in the format [D365FOURL]/api/services/[Your Service Group]/[Your Service]
  • API URL suffix: Give the APIM URL suffix (i.e. custom-service)
  • Now, select your API “Custom Service”
  • Go to “Settings” and update the following values and hit “Save” button.
  • Tags: Define your API tags (i.e. D365)
  • Products: Select the product that we created at “Step 2.1”
  • Gateways: Select the gateways (i.e. Managed)
  • Subscription required: Yes
  • Header name: Define the header name (i.e. Ocp-Apim-Subscription-Key)
  • Query parameter name: Define the query parameter name (i.e. subscription-key)
  • User authorization: Select OAuth 2.0
  • OAuth 2.0 server: Select the OAuth 2.0 server that we created at “Step 2.2” (i.e. APIM-OAuth2-Connection)

2.3.1. Add Operation

Now we add an operation to our API to define our service method.

  • In your APIM instance, go to “API” tab.
  • Select your API “Custom Service”
  • Go to “Design” and Click “+ Add operation” button.
  • Fill out the following values and hit “Save” button.
  • Display name: Give a name to your operation (i.e. getTestService)
  • Name: Auto-generated name based on your display name (i.e. gettestservice)
  • URL: Give your Fin & Ops custom service method name here (i.e. GET /getTestService)
  • Description: Give a description for your operation (i.e. Get customer name)
  • Tags: Define your API tags (i.e. D365)

2.4. Add Redirect URI

Here we’ll define the redirect URIs for our App Registration; these will be used by our API to get authorization codes.

  • Navigate to App registrations and open the app that we registered at “Step 1”.
  • Go to “Authentication”.
  • Click on “+ Add a platform” to add the redirect URIs.
  • Add the following URIs:
https://[APIM instance name].portal.azure-api.net/docs/services/[OAuth2 connection name]/console/oauth2/authorizationcode/callback
https://[D365FOURL]

2.5. Test API

Though you can test the API from your APIM instance, a better way is to test it from the Developer Portal. Let’s see the steps to test your API from the Developer Portal.

2.5.1. Publish Developer Portal

The first time around, we need to publish the developer portal for our APIM instance.

  • Go to “Portal overview” section.
  • Publish Your Developer Portal.

2.5.2. Open Developer Portal

  • Go to “Overview” section.
  • Under “Properties” tab, go to “Developer portal”.
  • Open Portal in “Administrative interface” mode.
  • Click on “Developer Portal” to access your portal.
  • Navigate to “APIs”, there you should be able to see your Custom API.
  • Click on your API.
  • Select Operation and Click on “Try it”
  • Define following values and hit “Send” button to call your custom service as an API.
  • Headers > Ocp-Apim-Trace: Should be “true”.
  • Headers > Ocp-Apim-Subscription-Key: Auto-populated from the “Subscription key” value under the “Authorization” section.
  • Headers > Authorization: Auto-populated from the access token acquired by the “APIM-OAuth2-Connection” entry under the “Authorization” section.
  • Authorization > APIM-OAuth2-Connection: Select “Authorization code” to acquire an access token.
  • Authorization > Subscription key: Select the “Primary subscription key” from your APIM product.
  • Request URL: Your APIM URL for this particular API; it will be used by external sources to consume your API (i.e. https://[APIM instance name].azure-api.net/custom-service/getTestService).
  • HTTP request: The complete HTTP request (here you can pass parameter values to your service method and set other values in your request).
  • You should receive a response with status “200 OK” and the metadata of your service method in the response content.
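Outside the Developer Portal, an external consumer’s call would look roughly like this (hypothetical host and placeholder values; the subscription key comes from the APIM product and the bearer token from the OAuth 2.0 flow configured at “Step 2.2”):

```
GET https://[APIM instance name].azure-api.net/custom-service/getTestService HTTP/1.1
Ocp-Apim-Subscription-Key: <primary-subscription-key>
Authorization: Bearer <access-token>
```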

Conclusion:

You can follow the above steps to configure the Azure API Management service and introduce monitoring and analytics capabilities between your D365 F&O custom service and its callers/consumers.

Your APIM Instance > Monitoring > Analytics > Timeline
Your APIM Instance > Monitoring > Analytics > APIs

In the next part of this series, we’ll see the configuration steps to consume a D365 F&O custom service from D365 CE via an APIM service instance.

References:

  1. https://devblog.sertanyaman.com/2019/05/15/empowering-d365-fo-service-endpoints-with-azure-api-management-service/
  2. https://docs.microsoft.com/en-us/azure/api-management/get-started-create-service-instance#create-a-new-service

Power Apps: Exception in Canvas Apps – Object not set to an instance of an object.

Hey there,

Recently, I have encountered the following exception in my Power Apps – Canvas App.

The requested operation is invalid.
Server response: [Datasource name] failed: An error has occurred. Object not set to an instance of an object.
ClientRequestedId: [#GUID]

Canvas App

If you are also facing this exception in your Canvas Apps, then the following proposed solution is for you.

Exception:

You are facing this exception because you are passing a null value to a Canvas App function (Filter/LookUp/Search, etc.). All you need to do is check your value before calling the function.

Solution:

To avoid this exception, just make sure you are writing your expressions in a correct way.

Correct:
If(
    !IsBlank(<'Control Name'>),
    Filter(<'Datasource name'>, <'Column name'> = <'Control Name'>.Text)
)

Incorrect:
Filter(<'Datasource name'>, <'Column name'> = <'Control Name'>.Text)

  1. You should use this approach with all such expressions in your Canvas App.
  2. “<'Control Name'>” inside the “IsBlank” function can be replaced with your variable name, if you are using a variable for filtering.
  3. Similarly, the “Filter” function can be replaced with LookUp, Search, or similar functions, depending on where you are getting the above error.
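For instance, when the filter value comes from a variable instead of a control, the same guard applies (the names varCustomer, Accounts, and 'Account Name' below are hypothetical):

```powerfx
// Guard the variable before using it in Filter, so a blank value
// never reaches the data source call.
If(
    !IsBlank(varCustomer),
    Filter(Accounts, 'Account Name' = varCustomer)
)
```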

I hope it’ll help you.

Thank you!

Power Apps: Data Validation Approach for Canvas Apps Barcode Scanner [Part 2]

Hey there,

In the previous part, we saw a data validation approach for the Canvas App scanner when dealing with a small data volume. In this part, we’ll see the validation approach that we should take when dealing with a large data volume.

Scenario: You are validating that a scanned product ID is a valid product ID in your Fin & Ops application. In Fin & Ops we can expect a large number of products, so this sort of validation requires some extra care to maintain accuracy.

So let’s see that approach with example.

My Canvas App

We are using the same Canvas App that we used in Part 1 of this series. We have added another scan screen for Product, and we’ll perform validation for a scanned product ID.

“Scanned Product ID” will show the product ID that you scan from a barcode.
“Valid Product” will show “Yes” if the scanned product is a valid product, and “No” for an invalid product.

Validation Approach:

To avoid validation inconsistencies when dealing with large data, we should first load the data into a collection and then use that collection for validation.

1) On the “OnStart” property of your App, write the following expression.
Clear(<Collection name>); Collect(<Collection name>, <Datasource name>);

2) On the “Text” property of the “Valid Product” value box, write the following expression.
If(IsBlank(LookUp(<Collection name>, <'Column name'> = <'Scanner Control Name'>.Text)), "No", "Yes")
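With concrete (hypothetical) names — a data source Products, a collection colProducts, a column ProductId, and a scanner control ProductScanner — the two steps look like this:

```powerfx
// 1) App OnStart: cache the data source locally
//    (ClearCollect is equivalent to Clear followed by Collect)
ClearCollect(colProducts, Products);

// 2) Text property of the "Valid Product" value box:
If(IsBlank(LookUp(colProducts, ProductId = ProductScanner.Text)), "No", "Yes")
```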

That’s it! Though the “OnStart” action will take a few seconds to load the data into your collection, this approach is accurate and validation will be super fast.

In this series we have learned some simple techniques to accurately validate the data that we scan with a Canvas App scanner. We’ll see more Canvas Apps techniques and approaches in upcoming blogs. So stay tuned!

Thank you!

Reference:

  1. https://abkblogs.home.blog/2021/01/30/power-apps-data-validation-canvas-apps-scanner-part-1/
  2. https://barcode.tec-it.com/en/QRCode?data=D00112

Power Apps: Data Validation Approach for Canvas Apps Barcode Scanner [Part 1]

Hey there,

Adding a barcode scanner to your Canvas App is quite a common requirement now, and whenever you scan something, you need some sort of validation for the scanned values.

In my experience, the right validation approach for a Canvas App scanner value depends on your data volume.

Scenario: You are validating that a scanned warehouse name is a valid warehouse in your Fin & Ops application. Because in Fin & Ops we generally can’t expect a large number of warehouses, this sort of validation requires a comparatively simple approach.

So let’s see that approach with example.

My Canvas App

To demonstrate this scenario, I have created a simple Canvas App with a scan screen for Warehouse and we’ll perform a validation for a scanned warehouse name.

“Scanned Warehouse Name” will show the warehouse name that you scan from a barcode.
“Valid Warehouse” will show “Yes” if the scanned warehouse is a valid warehouse, and “No” for an invalid warehouse.

Validation Approach:

If you are dealing with a small data volume, as in our example, then you can simply use the expression below on the “Text” property of the “Valid Warehouse” control to validate the scanned warehouse name.
If(IsBlank(LookUp(<'Datasource name'>, <'Column name'> = <'Scanner Control Name'>.Text)), "No", "Yes")

It’ll work fine as long as the data volume in your data source (“Warehouses” in our case) is small.

Note: I’m assuming you’re aware that, as of now, you can only test the barcode scanner on a mobile device.

But if you are dealing with a large data volume, you need to take another approach to maintain validation accuracy. We’ll see that approach in the next blog.

So stay tuned, and keep up the good work.

Thank you!

References:

  1. https://abkblogs.home.blog/2021/01/30/power-apps-data-validation-canvas-apps-scanner-part-2/
  2. https://powerapps.microsoft.com/en-us/blog/make-barcode-scanning-apps-in-minutes/
  3. https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/get-started-create-from-blank
  4. https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/controls/control-new-barcode-scanner
  5. https://barcode.tec-it.com/en/QRCode?data=Yorkville%20000_016

Flow a custom field from posted payment journal to vendor/customer transaction in Dynamics 365 Finance and Operations.

Hello everyone,

Whenever you add a new custom field to a payment journal, it becomes very important to flow that field to the posted transaction tables as well when the journal gets posted.

So today I’ll show you how to implement logic that flows a custom field from a posted payment journal to the vendor/customer transaction table in D365 F&O, using extensions and the Chain of Command (CoC) pattern.

VENDOR TRANSACTION

  1. Add a field (xyzExpenseCode) to the extensions of the LedgerJournalTrans table and LedgerJournalTransVendPaym form.
  2. Now add a field (xyzExpenseCode) to the extensions of the VendTrans table and form.
  3. Create an extension of the VendVoucher class (VendVoucherxyz_Extension) and wrap the initCustVendTrans method to define the following mapping between the journal and vendor transaction fields.

[ExtensionOf(classStr(VendVoucher))]
final class VendVoucherxyz_Extension
{
    protected void initCustVendTrans(
        CustVendTrans _custVendTrans,
        LedgerVoucher _ledgerPostingJournal,
        boolean       _useSubLedger)
    {
        next initCustVendTrans(_custVendTrans, _ledgerPostingJournal, _useSubLedger);

        VendTrans vendTrans = _custVendTrans as VendTrans; // Get the VendTrans buffer from the CustVendTrans map instance

        if (common.TableId == tableNum(LedgerJournalTrans)) // common holds the originating journal line
        {
            LedgerJournalTrans ledgerJournalTrans = common; // Get the journal line buffer
            vendTrans.xyzExpenseCode = ledgerJournalTrans.xyzExpenseCode; // Map the custom field
        }
    }
}

CUSTOMER TRANSACTION

  1. Add a field (xyzExpenseCode) to the extensions of the LedgerJournalTrans table and LedgerJournalTransCustPaym form.
  2. Similarly, add a field (xyzExpenseCode) to the extensions of the CustTrans table and form.
  3. Lastly, create an extension of the CustVoucher class (CustVoucherxyz_Extension) and wrap the initCustVendTrans method to define the following mapping between the journal and customer transaction fields.

[ExtensionOf(classStr(CustVoucher))]
final class CustVoucherxyz_Extension
{
    protected void initCustVendTrans(
        CustVendTrans _custVendTrans,
        LedgerVoucher _ledgerPostingJournal,
        boolean       _useSubLedger)
    {
        next initCustVendTrans(_custVendTrans, _ledgerPostingJournal, _useSubLedger);

        CustTrans custTrans = _custVendTrans as CustTrans; // Get the CustTrans buffer from the CustVendTrans map instance

        if (common.TableId == tableNum(LedgerJournalTrans)) // common holds the originating journal line
        {
            LedgerJournalTrans ledgerJournalTrans = common; // Get the journal line buffer
            custTrans.xyzExpenseCode = ledgerJournalTrans.xyzExpenseCode; // Map the custom field
        }
    }
}

Now, when you post a payment journal, your custom field will flow to the respective transaction tables along with all the other fields.

Happy Daxing,

References:

https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/extensibility/method-wrapping-coc

How to add a new bank cheque format in Dynamics 365 Finance and Operations using extensions and handlers?

Hello everyone,

In this session, I’ll share how we can add a new cheque format in D365 F&O through extensions and handlers.

Mandatory Steps:

  • Open Visual studio and create a new Dynamics365 project.
  • Now, create a report (xyzCheque_US) and design it as per your requirement (you can also duplicate an existing cheque report and make the required changes, as I did).
Design it as per requirement
  • Create a new output-type menu item (xyzCheque_US) and set the following properties.
Menu-item
Menu-item properties
  • Create an extension of the base enum ChequeFormType (ChequeFormType.xyzExtension) to add a new option (xyzUSStyle) for your cheque format.
Extended Enum

Cash and bank management > Bank accounts > Bank accounts > Layout > Check > Check form
  • Now we need to create some classes to execute this newly created cheque format.

1. Create the xyzBankChequePrintHandler class and add the following handler for the provided delegate.

[SubscribesTo(classStr(BankChequePrint), delegateStr(BankChequePrint, printDocumentDelegate))]
public static void determineReportMenuOutput(ChequeFormType _chequeFormType, EventHandlerResult _eventResult)
{
    if (_chequeFormType == ChequeFormType::xyzUSStyle)
    {
        _eventResult.result(menuItemOutputStr(xyzCheque_US)); // Return the output menu item for our format
    }
}

2. Create the xyzChequeControllerHandler class and add the following handler.

[SubscribesTo(classStr(ChequeController), delegateStr(ChequeController, initChequeReportFromChequeFormTypeDelegate))]
public static void initChequeReportFromChequeFormType(ChequeFormType _chequeFormType, BankChequeLayout _bankChequeLayout, EventHandlerResult _eventResult)
{
    if (_chequeFormType == ChequeFormType::xyzUSStyle)
    {
        _eventResult.result(ssrsReportStr(xyzCheque_US, Report)); // Return the SSRS report and design for our format
    }
}

3. Create the xyzCustVendChequeTxtCalHandler class and add the following handler.

[SubscribesTo(classStr(CustVendChequeSlipTextCalculator), delegateStr(CustVendChequeSlipTextCalculator, getChequeDocLengthDelegate))]
public static void getChequeDocLength(ChequeFormType _chequeFormType, EventHandlerResult _eventResult)
{
    const real ChequeSizeDefault = 88.89; // Cheque document length; define as per your requirement

    if (_chequeFormType == ChequeFormType::xyzUSStyle)
    {
        _eventResult.result(ChequeSizeDefault);
    }
}

4. Create an extension class BankPrintTestChequexyz_Extension of the BankPrintTestCheque class and add the following two methods (here we use the CoC pattern to execute our logic).

[ExtensionOf(classStr(BankPrintTestCheque))]
final class BankPrintTestChequexyz_Extension
{
    protected boolean canRunChequePrintForExtendedFormType(TmpChequePrintout _tmpChequePrintout)
    {
        boolean isExtended = next canRunChequePrintForExtendedFormType(_tmpChequePrintout);

        // Keep the result from other extensions and add our own format to it
        isExtended = isExtended || _tmpChequePrintout.ChequeFormType == ChequeFormType::xyzUSStyle;
        return isExtended;
    }

    protected void runChequePrintForExtendedChequeFormType(TmpChequePrintout _tmpChequePrintout)
    {
        next runChequePrintForExtendedChequeFormType(_tmpChequePrintout);

        if (_tmpChequePrintout.ChequeFormType == ChequeFormType::xyzUSStyle)
        {
            Args args = this.getChequePrintArgs(_tmpChequePrintout);
            new MenuFunction(menuItemOutputStr(xyzCheque_US), MenuItemType::Output).run(args);
        }
    }
}

Recommended Steps:

Although the steps below are optional, they are recommended to avoid BP errors and to follow best practices.

  • Create a new privilege (xyzBankChequeGenerate) and drag your menu item (xyzCheque_US) to its entry points to set the required security for this menu item.
  • Create extensions of the PaymVendorPaymentsMaintain and BankBankTransactionsMaintain duties and add the new privilege (xyzBankChequeGenerate) to these existing duties.

Now, deploy your report, run your project, and test the newly created cheque format by following this link.

Thanks and have a nice day.

References: