Large amounts of data or many visual components in your user interface may at some point affect the performance of your Apps and Services. In this section, we provide some tips on how to optimize performance and which pitfalls to avoid.
App Data is loaded before the App is displayed to the end user. All Database Connected Data Sources are loaded in parallel, unless one Data Source is filtered on another Data Source - in this case, these two Data Sources are loaded in sequence.
If the App has an "On App Load" action (configured in the App Settings), this action will be executed after App Data is loaded, before the initial view is displayed to the end user.
A Database Connected Data Source will also reload on the fly while the App is in use, as soon as Appfarm detects that the filter conditions of the Data Source have changed.
When App Data is loaded, Appfarm detects which properties are in use in any UI or action. Only those properties will be read. This also applies to Deep Data Bindings:
Person.Company.Name is a Deep Data Binding. Behind the scenes, it creates a technical Generated Data Source holding the Person ID, Company ID and Company Name. Such Generated Data Sources are not something you as a developer or end user should care about, but it is relevant to know that they exist: many Deep Data Bindings on Data Sources with many objects will cause a lot of data and Data Sources to be read on app load, and a lot of Data Sources for Appfarm to maintain while the App is in use.
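As a mental model, the Generated Data Source behind Person.Company.Name can be pictured as a simple projection. This is an illustrative sketch in plain JavaScript, not Appfarm's actual implementation; all names are hypothetical:

```javascript
// Hypothetical data - one Person referencing one Company.
const persons = [{ id: 'p1', companyId: 'c1' }];
const companies = [{ id: 'c1', name: 'Acme' }];

// Conceptually, the Generated Data Source holds one row per person with
// exactly the fields the deep binding needs: Person ID, Company ID, Company Name.
const generated = persons.map(p => {
  const company = companies.find(c => c.id === p.companyId);
  return {
    personId: p.id,
    companyId: p.companyId,
    companyName: company ? company.name : null
  };
});
```

Each deep binding produces one such join, which is why many deep bindings on large Data Sources add up to a lot of data read and maintained.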
Note that since all Objects and Properties in use are read upon App Load, any deploy to Test or Prod with changes affecting which data is read will be a "hard deploy". In other words, the user is presented with a "New deploy" dialog and must refresh the App. In the Development environment, you need to refresh manually, since the Client in Development has no such deploy dialog enabled.
Typical feedback from users might be that the "App is running slow". Start by asking whether the slowness is related to App startup, a specific view, or specific actions.
However, in order to do some analysis yourself, here are a few tips on where the bottlenecks might be. Open the console log, and open the App in the Production or Staging environment for the best analysis.
The startup time of the App is slow
As mentioned in the previous section, all initial loading of data happens before the App is displayed. In the log, you can measure the time from a refresh until the App Load action is executed (or the initial view is displayed). Long startup times (5+ seconds) are caused either by a large amount of data being read, or by many function properties calculated on large Data Sources. See the section Performance Tuning App Data, specifically Limit the amount of data being read and Limit the use of runtime properties with functions on Data Sources.
Actions with Set Selection, Read Objects, Update Objects or Create Objects are running slowly
Look for the following log entries in the Console Log:
- "Resolve Selection Dependencies" for <DATA SOURCE>: This means that the <DATA SOURCE> is filtered on Selected Object(s) in the Data Source where the Set Selection is performed. Consider a broader Data Source filter, and apply UI filters instead.
- "Resolve Side Effects": For example, a Read Objects, where Data Sources in App Data is filtered on the data read. Consider a broader Data Source filter, and apply UI filters instead.
- "Run Full Formula Recalculation" on <DATA SOURCE>: The <DATA SOURCE> has runtime-only properties of type Function, and this Function depends on the operation performed. The runtime property is recalculated for all entries of the <DATA SOURCE>. For example: the Data Source Orders has a runtime-only function property Number of Order Lines. Creating a new Order Line forces all Orders to recalculate their Number of Order Lines property, since no single Order knows whether the created Order Line belongs to it. Solution: remove this function property and apply it in the UI instead, or denormalize the data.
In general: Runtime Only properties of type Function are bad practice on large Data Sources, at least when they use other Data Sources as function parameters.
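The cost of such a recalculation can be illustrated with a plain JavaScript sketch (property and field names are hypothetical, not Appfarm internals):

```javascript
// Hypothetical "Number of Order Lines" runtime function property.
function numberOfOrderLines(order, orderLines) {
  return orderLines.filter(line => line.orderId === order.id).length;
}

const orders = [{ id: 1 }, { id: 2 }];
let orderLines = [{ orderId: 1 }];

// Creating ONE new Order Line forces the function to re-run for EVERY order,
// because no single order knows whether the new line belongs to it.
orderLines = orderLines.concat([{ orderId: 2 }]);
const counts = orders.map(o => numberOfOrderLines(o, orderLines));
```

With thousands of Orders and Order Lines, each create or update triggers a full pass like the `map` above, which is why these properties scale badly.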
Navigating to a specific view is slow
When navigating to a view, the UI of the view is calculated and initialized before it is actually displayed to the user. If the view has a lot of data, and many property conditions, functions etc. to be evaluated, loading the UI may be slow. What happens when drawing the UI is not logged to the console by default, so if a "Switch View" runs slowly without any log entries, you may conclude that the cause is heavy UI evaluation.
You may enable UI logging by writing the following in the console log:
There may be many causes of a slow UI. Typically, it is either too much data (reduce the amount of data displayed, e.g. reduce the number of rows displayed by default), or too many property conditions, visibility conditions, or enabled/read-only conditions.
This could be caused by temporary issues, such as an infrastructure problem; in that case, the symptoms will disappear after a while. If the sluggishness continues on a daily basis, you should inspect the console log.
A typical pitfall is to have "Subscribe to updates" on too many or too large Data Sources. You may see this in the console log by a vast amount of "Replace Data" or "Insert Data" entries, even when you are not doing anything in the App. The "Replace Data" entries in the log mean that a whole Data Source is refreshed / re-read from the database, typically due to the setup of the filter or runtime properties of the Data Source (changes to the dependent data give a full refresh of the Data Source). Please see section Reduce the number of Data Sources with "Subscribe to Updates" below.
This section will summarize some of the measures you may take for slow-running app loads or data updates.
Database Connected Data Sources could have a filter limiting the amount of data. For example: your app has a Data Source Orders. Read all on the Orders Data Source works fine for a while, but over time it may contain vast amounts of data.
- Add an App Variable Order From Date, with a default value of e.g. 1 year ago. Filter the Data Source on Order Date >= App Variables.Order From Date, and give the user the possibility to change the Order From Date
- Add another Data Source: Order history. It may be Runtime Only. Add a Button etc. to your UI: View order history, and apply a Read Objects (from database) with filter Order Date < App Variables.Order From Date when this button is clicked. Display the Order history in a dedicated view/dialog
Sometimes, developers tend to create multiple Database Connected Data Sources, holding the same data.
- Limit the number of Data Sources holding the same data. You may use a single Data Source in many cases, and apply various filters on it in UI or in Actions
Some data are rarely used in the App, for example, the change history of a Company. This is not core functionality, and those data may be read on demand.
- Set the flag Initially disabled on these Data Sources. When navigating to the views displaying this data, use the Action Node Set Data Source Attributes to enable the Data Source. When a Data Source is disabled, no data is read and data cannot be updated towards it. But once enabled, it behaves as a normal Database Connected Data Source
- Optionally, use a Runtime Only Data Source. You may change a Data Source from Database Connected to Runtime Only - you only need to make sure to apply the Action Node Read Objects when the data is needed. When operating on Runtime Only Data Sources, you also need to apply the Action Node Persist Data after updates have been made, in order to save these updates to the Database.
Data Sources may have Runtime Only properties calculated by functions. These functions are calculated for each instance in the Data Source upon app load, and whenever the inputs of the function change.
Instead, you could move those functions to the UI. For instance: you have a table listing all "My" Orders, and the Data Source Orders holds all orders. Instead of having a function property Order amount fx (with function return units*price) on the Data Source, you could add a table column with a value calculated by the same function. In this case, the function is only calculated for the rows of the table once the table is displayed.
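To see why moving the function to the UI helps, here is a plain JavaScript sketch of the idea (the data and row counts are made up):

```javascript
// The document's example function, now evaluated per displayed table row.
function orderAmount(units, price) {
  return units * price;
}

// Hypothetical Data Source with 10,000 orders.
const allOrders = Array.from({ length: 10000 }, () => ({ units: 2, price: 10 }));

// As a Data Source property, orderAmount would run for all 10,000 objects on load.
// As a UI column, it only runs for the rows actually displayed:
const visibleRows = allOrders.slice(0, 50); // e.g. the first table page
const displayed = visibleRows.map(o => orderAmount(o.units, o.price));
```

The work drops from the size of the Data Source to the size of the visible table page.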
The Console Log (when debugging your App) may reveal this issue. If you find entries such as Run full Formula Recalculation: <DATA-SOURCE-NAME> straight after a Create Object, Update Object, Persist or Set Selection operation, it means that the operation causes one or more runtime properties (with functions) in <DATA-SOURCE-NAME> (e.g. "Orders (all)") to be recalculated for all records in <DATA-SOURCE-NAME>.

Example: Orders (all) has a runtime property No of Order Lines, calculated by adding Order Lines (all) as a function parameter, filtering it, and returning its length. Somewhere in your logic, you Persist a new Order Line. Since the Data Source Orders (all) does not know which Order "owns" this Order Line, all entries will recalculate their No of Order Lines property.
If the Run full Formula Recalculation has a long execution time (e.g. > 500 ms), the solution is to remove the runtime property causing it, and move this logic somewhere else (such as calculating it as a function in your UI instead).
Note that in some cases, a function property is used to e.g. return a property of a referenced object. For example: you have a runtime function property Customer Name on the Data Source Contacts, with function return Contact.Company.Name. This could still be used, but can be optimized using Reference Data Sources. See the next section.
In some cases, you may need to reduce the number of deep data bindings. Reference Data Sources, a setting on Data Sources, may be used instead.
Example: You have a table listing all Contacts, and one of the table columns is Contacts.Company.Name. If the Contacts Data Source is large, in combination with many Data Sources and lots of data, this deep data binding could be avoided to improve performance.
In the above example, we already have a Data Source Customers holding all Customers, so the Customer Name is already read into the App.
Solution, with reference to the above example: On the Contacts Data Source, add a Reference Data Source to the Customers Data Source. This is explained here (scroll to the Reference Data Sources section).
Note that a Reference Data Source is only beneficial in the above example if the Customers Data Source is something you already use and need in your App. You should not add the Customers Data Source only for use as a Reference Data Source. Appfarm will always be able to retrieve the value of Contacts.Company.Name without a Reference Data Source.
Tip/Info: When deep data bindings are used, Appfarm generates GENERATED data sources (for the joins towards the connected object classes). You may see these generated Data Sources in the Dev Tool if you open the Console in the browser and type the following:
For example, if you have a Data Source Orders holding 10,000 orders, and another Data Source Order lines with filter Order lines.Order exists in Orders - this will be slow. It is a large query, with 10,000 entries in the "exists in" part of the database query.
Try instead to apply alternative filters.
A Data Source may have the setting "Subscribe to Updates" - a very nice feature since you will basically never have to refresh the App.
However, if the Data Source with this setting has frequent updates on a large amount of objects, the App may appear sluggish when the instances of this Data Source are refreshed. This happens asynchronously, but it affects the overall performance of the App.
Solution 1: Reduce the number of Data Sources with Subscribe to Updates.
Note that if Data Source A has Subscribe to Updates, and Data Source B (of the same object type) is filtered on Data Source A (e.g. Customer (selected)), you do not need Subscribe to Updates on Data Source B.
You should also consider whether the use case actually requires Subscribe to Updates. Example: a Data Source Orders (for selected customer) filters all Orders on the selected Customer, and is refreshed every time you open a new Customer. This Data Source does not need Subscribe to Updates, since it already refreshes very often due to its filter.
Solution 2: Reduce the number of large updates, causing a vast amount of traffic to the Apps
Example: A Service runs every 10 minutes, polling an external system for the Order Status of all undelivered Orders. 500 objects are processed in every execution. An easy way to do this is to iterate over all 500 Orders: for each iteration, run a web request and update the Order (Update Object) with the received values regardless, and also set a property "Last Synced Date" for tracking. However, by introducing an "IF the received values differ from the values already stored on the Order" (i.e. an If Action Node) before the Update Object, you only update the Orders that have actually changed - reducing the amount of data pushed to the Apps through Subscribe to Updates.
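The "only update if changed" guard can be sketched in plain JavaScript; the property name status is an assumption for illustration, not part of any real Order model:

```javascript
// Hypothetical diff check: should this order be written back at all?
function needsUpdate(order, received) {
  return order.status !== received.status;
}

const orders = [
  { id: 1, status: 'SHIPPED' },
  { id: 2, status: 'PENDING' }
];
const polled = [
  { id: 1, status: 'SHIPPED' },   // unchanged - skip the update
  { id: 2, status: 'SHIPPED' }    // changed - update this one
];

// Only changed orders generate an update (and thus Subscribe to Updates traffic).
const toUpdate = orders.filter((o, i) => needsUpdate(o, polled[i]));
```

Of the 500 polled Orders, typically only a handful differ, so the volume of updates pushed to connected Apps shrinks accordingly.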
Example: You have an Object hierarchy Company, Orders, Order lines. You only want to load Orders and Order Lines for your Company (Person (logged in).Company). Order lines have no property Company, and you apply the following filters:
- Company: Company.ID equals Person (logged in).Company
- Orders: Orders.Company equals Company
- Order lines: Order lines.Order exists in Orders
Order lines is the issue here. Note also that these 3 Data Sources are read in sequence. If Person.Company is changed, all 3 Data Sources will be re-read from the database.
Solution: Denormalize the data. You could add a property Company on the Order lines Data Source. You just need to make sure that Order Lines.Company is stored when new Order Lines are created. With this property, you may apply the following filters instead:
- Company: Company.ID equals Person (logged in).Company
- Orders: Orders.Company equals Company
- Order lines: Order lines.Company equals Company
With this solution, you avoid the large "exists in" query, and Orders and Order lines are populated in parallel. In addition, you can access Order lines.Company.Name when displaying the Company Name in your Order lines table, instead of Order lines.Order.Company.Name - reducing the path of the deep data binding.
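The difference between the two filter strategies can be illustrated in plain JavaScript (all data and field names are made up; the real queries run in the database):

```javascript
const myCompany = 'c1';
const orders = [
  { id: 'o1', company: 'c1' },
  { id: 'o2', company: 'c2' }
];

// Before: "exists in" - every Order line must be matched against the
// (potentially 10,000-entry) list of my Orders.
const lines = [{ order: 'o1' }, { order: 'o2' }];
const myOrderIds = new Set(
  orders.filter(o => o.company === myCompany).map(o => o.id)
);
const viaExistsIn = lines.filter(l => myOrderIds.has(l.order));

// After denormalization: each line carries Company directly,
// so the filter is a simple equality check.
const denormalizedLines = [
  { order: 'o1', company: 'c1' },
  { order: 'o2', company: 'c2' }
];
const viaEquality = denormalizedLines.filter(l => l.company === myCompany);
```

Both filters return the same lines, but the equality version needs no intermediate Orders list and lets the database use a plain index on Company.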
Example: You have a CRM system with Companies and Contacts. The Data Source Contacts has filter Contact.Company equals Companies (selected). Your UI has a list of companies, letting you quickly select a Company (selection is set) to display its Contacts (and more) in a pane to the right.
Every click in the list causes the Contacts Data Source to be read from the server. A read from the server takes some milliseconds of round trip, plus some bytes of data transfer. Selection in the list could therefore appear sluggish to the end user, especially if many Data Sources are filtered on the selected Company.
Solution: Apply a "wide" filter on the Contacts Data Source (holding all Contacts for all Companies) and apply a filter in the UI instead: The right pane in your UI could filter only those Contacts belonging to the selected Company.
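The idea of filtering in the UI instead of re-reading from the server can be sketched like this (illustrative JavaScript with hypothetical field names):

```javascript
// With a "wide" Data Source filter, all Contacts are already in the client.
const allContacts = [
  { name: 'Ada', companyId: 'c1' },
  { name: 'Bo', companyId: 'c2' },
  { name: 'Cy', companyId: 'c1' }
];

// Selecting a Company is then a pure client-side filter - no server round trip.
function contactsForCompany(companyId) {
  return allContacts.filter(c => c.companyId === companyId);
}
```

Each click in the company list only re-runs the local filter, so selection feels instant even with many dependent panes.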
Sometimes, when creating complex User Interfaces, the loading of the User Interface itself may be slow. Typically when creating repeating constructions with lots of elements and a lot of property conditions on each element.
Enable the Pagination option on tables. A table should never display more than 100 records on a single page.
If you have created a custom list (a Container with Repeat Data), you could tick Enable Virtualization on the repeating Container. The child of this repeating container should then have a fixed height (e.g. 80px), and on the repeating container you set Item Size to 80. The setting Overscan is the number of items to draw outside the visible window.
Example: A repeating container resulting in 1000 rows. Without Virtualization, the HTML (DOM-tree) for 1000 rows will be drawn even though not visible until you scroll. With Virtualization, only the visible rows + overscan will be drawn. The rest will be drawn when scrolling, resulting in a bit slower scrolling.
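The arithmetic behind virtualization can be sketched as follows; this is a generic illustration of the windowing technique, not Appfarm's internal algorithm:

```javascript
// Which rows get DOM nodes? Only those inside the viewport, plus overscan.
function visibleRange(scrollTop, viewportHeight, itemSize, overscan, totalRows) {
  const first = Math.floor(scrollTop / itemSize);
  const last = Math.ceil((scrollTop + viewportHeight) / itemSize);
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(totalRows, last + overscan) // end is exclusive
  };
}

// 1000 rows of 80px each, a 400px viewport scrolled to the top, overscan 5:
const range = visibleRange(0, 400, 80, 5, 1000);
// Only rows [range.start, range.end) are drawn instead of all 1000.
```

This is why the fixed Item Size matters: the renderer can compute the window with simple division instead of measuring every row.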
When an action executes and updates data that impacts the UI (e.g. properties used in property conditions in the visible UI), the UI re-evaluates and re-draws while the action is executing. To avoid this, two options exist. One of them is to enable a Loader on the main container of your UI.
To enable a loader, create an App Variable Loading (bool). On the container, create a data binding to this variable in the Show Loader setting. The action should set the variable to true at the start, and to false at the end.
This is the alternative to enabling a loader (as explained above). Actions have a setting Pause Render. Tick this option if you want UI updates to wait until the Action has finished executing.
Sometimes when e.g. performing batch updates, the execution of actions could be the bottleneck.
Update Objects may be used to update many objects at once, with individual calculations of updated values.
Example: You have an Order Lines Data Source with the properties Units and Price. For each Order Line, you want to update another property based on these.

The sub-optimal approach is to use a For Each Action Node, and update each object (in context) in each iteration with a function.

Instead, you could use a single Update Object with the function return units * price. Simply add Order Lines.Units and Order Lines.Price as function parameters; the function parameters and the return value of the function are evaluated individually for each record in the Data Source.
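Conceptually, a single Update Object with a function behaves like one pass over the records, evaluating the function per record. A plain JavaScript sketch (the field names, including amount, are assumptions for illustration):

```javascript
// Hypothetical Order Lines, each with Units and Price.
const orderLines = [
  { units: 2, price: 10, amount: null },
  { units: 3, price: 5, amount: null }
];

// One batched operation: the function (units * price) is evaluated
// individually for each record - no per-object action iteration needed.
const updated = orderLines.map(line => ({
  ...line,
  amount: line.units * line.price
}));
```

One operation replaces N iterations of a For Each, which is where the time is saved.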
Example: You want to loop over all Customers and, for each of them, create a new Activity. If you use the Action Node Create Object towards the Database Connected Data Source Activities, a round trip is made to the server in each iteration: each iteration creates and saves an Activity in the database.
Instead, add a Runtime Only Data Source Activities (temp) with cardinality Many, and Create Objects into this Data Source in each iteration. After the For Each, the Activities (temp) Data Source holds all the created objects, but they have not been saved yet. Simply add a Persist Data Source Action Node after the For Each, and all created Activities are saved at once - with only one round trip to the server.
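The batching pattern can be sketched in plain JavaScript; createObject and persist are hypothetical stand-ins for the Create Object and Persist Data Source Action Nodes:

```javascript
let serverRoundTrips = 0;
const activitiesTemp = [];

// Stand-in for Create Object into a Runtime Only Data Source:
// nothing is sent to the server yet.
function createObject(activity) {
  activitiesTemp.push(activity);
}

// Stand-in for Persist Data Source: all accumulated objects
// are saved in a single round trip.
function persist() {
  serverRoundTrips += 1;
}

const customers = ['c1', 'c2', 'c3'];
for (const customer of customers) {
  createObject({ customer, type: 'follow-up' });
}
persist();
```

Three Customers produce three created Activities but only one round trip, instead of one per iteration.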
Example: You have a Data Source Companies and another Data Source Contacts with filter Contact.Company equals Company (selected).

Your action has a For Each iterating Companies, with a Set Selection (object in context) inside. Each Set Selection results in re-filtering the Contacts Data Source (and a round trip to the server). Instead, if you need the selected company, use a single-cardinality Runtime Only Data Source, and read the company object in context into it.
In general, all Service Data is populated with every execution of a Service Interface, before the action is executed.
A few tips when working with Services (in addition to the above):
- Use Runtime Only Data Sources, also for reading data. This gives you more control over the performance, since all Database Connected Data Sources are automatically populated every time a Service Endpoint is triggered.
- When updating or creating data in iterations, create or update into a Runtime Only Data Source, and perform a Persist Data after the iterations. Note: in some cases - such as logging that an order confirmation has been sent - data should rather be persisted in each iteration, to make sure important information is not lost if the service action fails before the iterations finish.
- Limit the amount of data read, especially for Service Endpoints that are triggered often. When processing large amounts of data, use Limit Object Count when performing Read Objects where the result may be thousands or tens of thousands of objects.
- Rather create many Services than one Service with many Service Endpoints.
- Use the setting Skip Function Properties on Data Sources with function properties defined in the Global Data Model. This avoids calculating these properties when reading large amounts of objects in Services.
The data size of an app is the sum of the size of all its views, actions, and data sources. Blocks of code, comments, and other strings defined within these building blocks will naturally also contribute to the size.
The current size limit in Appfarm allows for quite large apps. However, should you hit this limit for some reason, there are some steps you can take to reduce the overall size:
- Remove all unused elements. Use App Health to identify such elements. Don't use the app as a backup dump. Remember that the size of the app will impact resource usage and runtime performance both server-side and client-side, as well as bandwidth usage. Therefore, eliminating unused elements will benefit both you and your users.
- Remove redundant elements. Audit the containers in your views – often there are a lot of unnecessary containers in use.
- Review your app with business logic architecture in mind. If your app still reaches the limit after you have removed unused and redundant elements, chances are that your app is doing too much. Can you split it into multiple apps? Splitting into smaller apps will reduce the overall complexity of the solution, spread the load better, and increase the performance for end users.