1. sanping peng
  2. .NET DataStore
  3. Thursday, 13 January 2022 18:05 UTC

I am preparing a project based on .NET DataStore. After downloading the .NET DataStore sample code and running it, memory increased rapidly when refreshing a page based on .NET DataStore, and the more computed columns there are, the more memory increases. Why? Why does unmanaged memory increase? Is it because of dynamic compilation? Is there a cache strategy to avoid this?

After several refreshes, about 691 MB of memory was used.

When opening a new page, memory increased rapidly.

Thanks.

Logan Liu @Appeon Accepted Answer Pending Moderation
  1. Friday, 14 January 2022 10:31 AM UTC
  2. .NET DataStore
  3. # 1

Hi Sanping,

Please note that .NET DataStore is a 100% managed-code C# runtime, so it benefits from the full power of the Common Language Runtime (CLR), such as built-in security, fast performance, and ease of deployment. Since the GC automatically releases the memory of unreferenced objects according to its optimal strategy, it is not recommended to release memory manually. To learn more about .NET automatic memory management, refer to: https://docs.microsoft.com/en-us/dotnet/standard/automatic-memory-management
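As a minimal illustration of that point (a standalone sketch, not code from the demo), the following C# snippet shows that memory held by unreferenced objects is reclaimed once a collection runs, so a growing working set does not by itself indicate a leak:

```csharp
using System;

class GcDemo
{
    static void Main()
    {
        long before = GC.GetTotalMemory(forceFullCollection: true);

        // Allocate many temporary arrays that become unreachable immediately.
        for (int i = 0; i < 1000; i++)
        {
            var buffer = new byte[80 * 1024]; // garbage after each iteration
        }

        // Force a full collection; the temporary arrays are reclaimed.
        long after = GC.GetTotalMemory(forceFullCollection: true);

        // "after" stays close to "before" because the unreferenced
        // arrays were collected, even though ~80 MB was allocated in total.
        Console.WriteLine($"Before: {before / 1024} KB, After: {after / 1024} KB");
    }
}
```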

It also seems that you are running the VS debugger against a debug build of the application, which will further increase the memory usage of the process. Refer to:
https://docs.microsoft.com/en-us/visualstudio/profiling/running-profiling-tools-with-or-without-the-debugger?view=vs-2019

This .NET DataStore example was mainly designed to demonstrate the functionality of .NET DataStore, and it has not been specifically optimized for performance. When opening each page in the PB application and loading data for the first time, many .NET DataStore objects are newly created to retrieve many rows from the database and to generate the JSON data packages that are returned to the PB application. These steps consume a lot of memory. If you are interested in the performance of .NET DataStore, you can also measure it with a professional .NET performance testing framework, e.g.: https://github.com/dotnet/BenchmarkDotNet.
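For instance, a minimal BenchmarkDotNet harness with the `MemoryDiagnoser` attribute reports allocations and GC collections per operation; the benchmark body below is a placeholder for whatever retrieval code you want to compare:

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

[MemoryDiagnoser] // adds Gen0/Gen1/Gen2 counts and allocated bytes per op to the report
public class RetrieveBenchmark
{
    [Benchmark]
    public void RetrieveRows()
    {
        // Placeholder: put the .NET DataStore (or EF Core / ADO.NET) retrieval
        // code you want to measure here, e.g. dataStore.Retrieve(args);
    }
}

public class Program
{
    public static void Main() => BenchmarkRunner.Run<RetrieveBenchmark>();
}
```

Running the project in Release mode then gives a like-for-like allocation comparison between the different data-access approaches.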

Regards,
Logan

Comment
  1. Miguel Leeuwe
  2. Friday, 14 January 2022 11:47 AM UTC
Hi Logan,

Though I understand that the examples are just to "show us some specific way to do things", it would be very nice if they could also show us good practices to optimize performance and memory usage. Be aware that many of us are 'noobs' in C# programming and we would literally copy code from the examples for our production applications.

Just a thought,

regards,

MiguelL
sanping peng Accepted Answer Pending Moderation
  1. Friday, 14 January 2022 13:15 UTC
  2. .NET DataStore
  3. # 2

Hi Logan, Miguel,

Thanks for your replies.

After comparing with pages (same logic as the sample page) based on EF Core and ADO.NET, the .NET DataStore version's memory increased faster and used more overall.

As Miguel said, can Appeon show us some good practices to optimize performance and memory usage?

A professional comparison or best-practices guide would encourage more developers to adopt .NET DataStore.

Thanks.

sanping peng Accepted Answer Pending Moderation
  1. Friday, 14 January 2022 13:32 UTC
  2. .NET DataStore
  3. # 3

After tracing and analysis, dynamic compilation may cause a memory leak (unmanaged memory). If the results of dynamic compilation are not cached correctly, this can eventually bring the system down, so best practices for optimizing performance and memory usage should be provided to ease developers' worries.
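As a hedged sketch of the caching idea (not how .NET DataStore is actually implemented internally), a dynamically compiled delegate can be cached by key so each distinct expression is compiled only once rather than on every page refresh:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq.Expressions;

static class CompiledExpressionCache
{
    // Cache keyed by expression text, so each distinct expression
    // is compiled to IL only once instead of on every request.
    private static readonly ConcurrentDictionary<string, Func<double, double>> Cache = new();

    public static Func<double, double> GetOrCompile(string key, Expression<Func<double, double>> expr)
        => Cache.GetOrAdd(key, _ => expr.Compile());
}

class Demo
{
    static void Main()
    {
        // Compiled on the first call, reused on the second.
        var f1 = CompiledExpressionCache.GetOrCompile("double", x => x * 2);
        var f2 = CompiledExpressionCache.GetOrCompile("double", x => x * 2);
        Console.WriteLine(ReferenceEquals(f1, f2)); // same cached delegate: True
    }
}
```

The cache key, delegate type, and helper names here are all hypothetical; the point is only that repeated dynamic compilation, not compilation itself, is what a cache avoids.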

thanks.

Logan Liu @Appeon Accepted Answer Pending Moderation
  1. Sunday, 16 January 2022 09:45 AM UTC
  2. .NET DataStore
  3. # 4

Hi Sanping,

I guess your current question is mainly why memory increases and is not released immediately in an ASP.NET Core Web API project. You will also see memory increase quickly when using other ORMs to retrieve a lot of data in a Web API project. If not, please provide a code sample here.

The .NET Garbage Collector has two different modes (https://docs.microsoft.com/en-us/aspnet/core/performance/memory?view=aspnetcore-3.1#workstation-gc-vs-server-gc):

  • Workstation GC: Optimized for the desktop. Workstation GC is designed for desktop applications to minimize the time spent in GC. In this case GC happens more frequently but with shorter pauses in application threads.
  • Server GC: The default GC for ASP.NET Core apps, optimized for the server. Server GC favors application throughput at the cost of longer GC pauses. Memory consumption will be higher, but the application can process a greater volume of data without triggering garbage collection.

The GC mode can be set explicitly in the project file of the .NET DataStore sales demo. You can set ServerGarbageCollection to false to switch to Workstation GC mode and see whether it behaves as you expect:

<PropertyGroup>
  <ServerGarbageCollection>false</ServerGarbageCollection>
</PropertyGroup>

Start this .NET DataStore sales demo again and you will see that the process only costs about 100~150 MB of memory. This at least proves that there is no memory leak in this demo.
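To confirm which mode the process actually picked up, `System.Runtime.GCSettings.IsServerGC` can be logged at startup (a small standalone sketch, not part of the demo):

```csharp
using System;
using System.Runtime;

class GcModeCheck
{
    static void Main()
    {
        // Prints False when <ServerGarbageCollection>false</ServerGarbageCollection>
        // in the project file took effect; True under the ASP.NET Core default.
        Console.WriteLine($"Server GC: {GCSettings.IsServerGC}");
        Console.WriteLine($"Latency mode: {GCSettings.LatencyMode}");
    }
}
```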

Please note that Workstation GC mode is usually not recommended when running this application on a server. In a typical web server environment, CPU usage matters more than memory, so Server GC mode is better. If memory utilization is high and CPU usage is relatively low, Workstation GC might be more performant, for example in high-density hosting of several web apps where memory is scarce.

Regards, Logan
