
What is a page fault and how can I reduce it?


sagar474

I'm being told that my program generates more than 1000 page faults.
What actually are they, and how can I reduce them?

Can I reduce them by making some modifications to my code?
 


Buy more RAM for your system. Your system's MMU is working overtime. Is your program using shared memory?


Contrary to what the name 'page fault' might suggest, page faults are not errors. They are common and necessary to increase the amount of memory available to programs in any operating system that uses virtual memory, including Microsoft Windows and Unix-like systems.


Page Fault
 
Code:
#include <stdio.h>
#include <vector>
using namespace std;

// Reads unsigned integers from the file named on the command line and prints
// the length of the longest increasing subsequence found in them.
int main(int argc, char* argv[])
{
    unsigned int size = 0;
    unsigned int u, v;
    unsigned int value;
    FILE *file;
    vector<unsigned int> arr;   // unsigned integers read from the file

    //-----file read operation------
    if (argc < 2) return 1;
    file = fopen(argv[1], "r");
    if (file == NULL) return 1;

    while (fscanf(file, "%u", &value) == 1)
    {
        ++size;
        arr.resize(size);       // grows the vector one element at a time
        arr[size - 1] = value;
    }
    fclose(file);
    //------end of file read operation---

    //-------actual calculation-----------
    vector<unsigned int> b;                 // indices of subsequence tails
    vector<unsigned int> p(arr.size());     // predecessor index for each element

    if (!arr.empty())
    {
        b.push_back(0);

        for (size_t i = 1; i < arr.size(); i++)
        {
            // arr[i] extends the longest increasing subsequence found so far
            if (arr[b.back()] < arr[i])
            {
                p[i] = b.back();
                b.push_back(i);
                continue;
            }

            // binary search for the first tail that is not smaller than arr[i]
            for (u = 0, v = b.size() - 1; u < v;)
            {
                unsigned int c = (u + v) / 2;
                if (arr[b[c]] < arr[i]) u = c + 1; else v = c;
            }

            if (arr[i] < arr[b[u]])
            {
                if (u > 0) p[i] = b[u - 1];
                b[u] = i;
            }
        }

        // follow the predecessor links to recover the subsequence indices
        for (u = b.size(), v = b.back(); u--; v = p[v]) b[u] = v;
    }
    //--------end of actual calculation----

    //-----printing output---------
    printf("%zu\n", b.size());

    return 0;
}

This is my program. How can I modify it to reduce page faults?
 


You are not using shared memory between processes/applications, and C# takes care of most garbage collection, so the only apparent cause of the excessive page faults would be the use of the vectors.

What size files are you loading into the vectors? How much RAM do you have in your machine?

Page faults typically indicate low-memory conditions; the MMU is busy scrounging for memory.

If the files are excessively large, you may want to process them in smaller chunks.
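Something along those lines, as a rough sketch of the streaming idea (this assumes the processing can consume one value at a time, which a calculation that needs the whole data set cannot do directly, so treat it only as an illustration of the pattern):

Code:
// Rough sketch: stream the file instead of loading it all at once.
// Only one value is resident at a time; the running sum is just a
// placeholder for whatever per-value processing applies.
#include <stdio.h>

int main(int argc, char* argv[])
{
    if (argc < 2) return 1;

    FILE* file = fopen(argv[1], "r");
    if (file == NULL) return 1;

    unsigned int value;
    unsigned long long sum = 0;   // placeholder "processing"

    while (fscanf(file, "%u", &value) == 1)
        sum += value;

    fclose(file);
    printf("%llu\n", sum);
    return 0;
}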

BigDog
 

The file contains up to 1600000 positive ints, and the report is:

Number of major page faults 0 to 0
Number of minor page faults 268 to 5440

Roughly how much time does it consume with these page faults?
 


Yes, a file of that size is why your app is causing the high number of minor page faults.


How much time the page faults consume largely depends on your system and its resources. You could roughly estimate it by processing a smaller file which represents a fraction of the larger one: if you timed the app's runtime on both the original file and the smaller file, then multiplied the smaller file's time by the appropriate factor to represent the larger file, you would have a rough idea of the time attributable to the page faults.

And obviously I do mean rough.
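On Linux you can also get the numbers directly from inside the program. A rough sketch (the fault counters come from getrusage(), which your Ubuntu system provides; where you put the timing calls around your own processing is up to you):

Code:
#include <stdio.h>
#include <time.h>
#include <sys/resource.h>

int main(void)
{
    clock_t start = clock();

    /* ... load and process the file here ... */

    double seconds = (double)(clock() - start) / CLOCKS_PER_SEC;

    struct rusage ru;
    getrusage(RUSAGE_SELF, &ru);   /* resource usage of this process */

    printf("cpu time: %.3f s, minor faults: %ld, major faults: %ld\n",
           seconds, ru.ru_minflt, ru.ru_majflt);
    return 0;
}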

BigDog
 
I replaced the vectors with arrays, and my execution speed doubled and the page faults were reduced.

But I still can't understand why my program takes more than 53MB of memory while I am processing only 1600000*4 = 6.4MB of ints. Even if I made a duplicate copy of the data, it should not take more than 20MB, including the instructions used to process them.
 

The use of a vector does come with additional overhead. How much, I do not know.
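One way to see at least part of that overhead is to compare size() with capacity(); a quick sketch (the 1600000 count is just taken from your description):

Code:
#include <stdio.h>
#include <vector>
using namespace std;

int main()
{
    vector<unsigned int> arr;
    for (unsigned int i = 0; i < 1600000; ++i)
        arr.push_back(i);

    // size()     = elements actually stored
    // capacity() = elements the vector has allocated room for
    printf("size:     %zu elements (%zu bytes)\n",
           arr.size(), arr.size() * sizeof(unsigned int));
    printf("capacity: %zu elements (%zu bytes)\n",
           arr.capacity(), arr.capacity() * sizeof(unsigned int));
    return 0;
}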

The fopen() call may also create a buffer, dependent on the size of the file.

I believe Visual Studio has a profiler which should be able to give you more answers.

BigDog
 

If fopen() allocates memory, can we use it to store the integers? That way I could reduce the use of realloc() thousands of times.
Text files generally do not contain headers, so how does fopen() know the size of the file?

Sorry, I'm using Ubuntu Linux, hence I can't go through Visual Studio.
 


Is this Mono? I thought it looked a little strange.

Of course, the exact implementation of fopen() is compiler dependent; however, Plauger's The Standard C Library states it is fully buffered.
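If you want control over that buffer yourself, the standard setvbuf() call lets you replace or resize it. A small sketch (the 1 MB size is just an arbitrary example, and setvbuf() has to be called before the first read):

Code:
#include <stdio.h>

int main(int argc, char* argv[])
{
    if (argc < 2) return 1;

    FILE* file = fopen(argv[1], "r");
    if (file == NULL) return 1;

    /* _IOFBF = fully buffered; NULL lets the library allocate the buffer */
    setvbuf(file, NULL, _IOFBF, 1 << 20);

    unsigned int value;
    while (fscanf(file, "%u", &value) == 1)
    {
        /* process value */
    }

    fclose(file);
    return 0;
}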

You may want to take a look at the source and see exactly how it is written.

You may also want to look around for an open-source profiler; I'm sure one exists.

BigDog
 

That depends on the memory requirements of your program, the memory resources of your system, and what other processes are running in your OS.

If it is a very simple program, executed on a freshly rebooted system with plenty of RAM, sure.

Something like your app, probably not. There is a reason they are classified as Minor Page Faults; they are a regular occurrence to some extent.

When you see Major Page Faults, then you need to worry.

BigDog
 

Well, I think your problem is misuse of STL vector(s). The trouble with vector is that it reallocates memory (even if you don't do it explicitly) so it can keep a contiguous memory layout for its elements. Each time it runs out of space, the vector grabs a new, bigger slice (I think twice the original size).

If you intend to use vectors, it is best practice to 'guess' how much memory you will need and reserve it (by calling reserve()). There is a reason for this behaviour, and it lies in memory fragmentation: vectors cause very little fragmentation since they use a contiguous block of memory. With lists, queues, etc. this is not the case. Again, you are safe with a vector if you 'predict' (or, even better, know exactly) the number of elements. Trivial example: if you parse a file containing integers, you can obtain the file size and divide it by sizeof(int) to see roughly how many ints you have.
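For a plain text file like yours, a rough sketch of that idea could look like this (the stat() call gives the file size on Linux; dividing by two is just an assumption that each value takes at least one digit plus one separator, so it only gives an upper bound to reserve):

Code:
#include <stdio.h>
#include <vector>
#include <sys/stat.h>
using namespace std;

int main(int argc, char* argv[])
{
    if (argc < 2) return 1;

    struct stat st;
    if (stat(argv[1], &st) != 0) return 1;

    vector<unsigned int> arr;
    // each value occupies at least 2 bytes of text (digit + separator),
    // so file_size / 2 is an upper bound on the number of values
    arr.reserve((size_t)st.st_size / 2 + 1);

    FILE* file = fopen(argv[1], "r");
    if (file == NULL) return 1;

    unsigned int value;
    while (fscanf(file, "%u", &value) == 1)
        arr.push_back(value);   // no reallocation while within the reserve

    fclose(file);
    printf("read %zu values, capacity %zu\n", arr.size(), arr.capacity());
    return 0;
}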

hope it helps
 

I replaced the vectors with arrays, and that cut the page faults roughly in half.

How can I find the file size for a text file, since text files have no header?
If I use fseek, it would amount to reading the file twice.
 


Why not create a pseudo header at the beginning of the text file, containing the number of integers in the file, and then use this value to calculate the size of the data?
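A sketch of that idea, assuming the first line of the text file holds the count of values that follow:

Code:
#include <stdio.h>
#include <vector>
using namespace std;

int main(int argc, char* argv[])
{
    if (argc < 2) return 1;

    FILE* file = fopen(argv[1], "r");
    if (file == NULL) return 1;

    // pseudo header: the first value in the file is the number of values following it
    unsigned long count = 0;
    if (fscanf(file, "%lu", &count) != 1) { fclose(file); return 1; }

    vector<unsigned int> arr;
    arr.reserve(count);               // one allocation up front

    unsigned int value;
    while (fscanf(file, "%u", &value) == 1)
        arr.push_back(value);

    fclose(file);
    return 0;
}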

BigDog
 

Code:
#include <cstring>
#include <stdexcept>

#if defined(WIN32)

    #include <windows.h>

    #pragma warning (disable: 4290)    // A function is declared using exception specification, which Visual C++ accepts but does not implement.
    #pragma warning (disable: 4996)    // The POSIX name for this item is deprecated

    typedef INT8    int8_t;
    typedef UINT8   uint8_t;
    typedef INT16   int16_t;
    typedef UINT16  uint16_t;
    typedef INT32   int32_t;
    typedef UINT32  uint32_t;
    typedef INT64   int64_t;
    typedef UINT64  uint64_t;
#else

    #include <stdint.h>
    #include <sys/stat.h>

#endif

// Returns the size of the file at pszPath in bytes, or throws on error.
uint64_t GetFileSize(const char* pszPath)
{
    if ( (NULL == pszPath) || (0 == strlen(pszPath)) )
        throw std::runtime_error("Invalid path!");

    uint64_t nSize = 0;

#ifdef WIN32

    HANDLE hFile = ::CreateFileA(pszPath, 0, FILE_SHARE_READ|FILE_SHARE_WRITE, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);

    if (INVALID_HANDLE_VALUE == hFile)
        throw std::runtime_error("Could not open file");

    try
    {
        BY_HANDLE_FILE_INFORMATION fiInfo;
        if (!::GetFileInformationByHandle(hFile, &fiInfo))
            throw std::runtime_error("GetFileInformationByHandle failed");

        nSize = (((uint64_t)fiInfo.nFileSizeHigh) << 32) + (uint64_t)fiInfo.nFileSizeLow;
    }
    catch (...)
    {
        ::CloseHandle(hFile);
        throw;
    }

    ::CloseHandle(hFile);

#else

    struct stat st;
    if (0 != stat(pszPath, &st))
        throw std::runtime_error("stat failed");

    nSize = static_cast<uint64_t>(st.st_size);

#endif

    return nSize;
}
It has a Linux/Unix 'branch' which you can throw away if you are on Windows only. I modified the original code to throw std::runtime_error instead of a CException derived from std::exception, to make things simpler.
 
