Deleting duplicate .xml files from the ViewDrop folder

Don

Okay, here is my situation:

I have a plan that takes 3 hours to process through the ViewDrop
folder. While that is processing, the users continue to add .xml files
to the folder for processing. A large majority of the files are
identical except for the time stamp. Below is an example of one of the
.xml files.

What I am wondering is, if there are 5 files that are the same (except
for the time stamp) that have not been processed, would there be a
problem with deleting 4 of the files and only having the 1 remaining
file process? I am assuming that this would work because all of the
update information is in the database, not in the .xml file.

What I want to eliminate is processing the same plan 5 times at, let's
say, 10 minutes for each .xml (50 minutes) and instead process only 1
for 10 minutes.

If this 1 file is processed, all updates that have been made to the
project will be processed, correct?

<ViewsDropData>
  <ProjectData>
    <EntProjectName><![CDATA[ EMP435_21 IM MedPtD_BTI.Published ]]></EntProjectName>
    <EntProjectID>305</EntProjectID>
  </ProjectData>
  <OrgGuid>projectserver</OrgGuid>
  <ProjectID>336</ProjectID>
  <BasePath>projectserver</BasePath>
  <TimeStamp>2006-06-08T12:02:47</TimeStamp>
</ViewsDropData>

Thanks for your input.

Don

Don

Any opinions on this? Just looking to verify my logic. As it stands
now, I delete all files out of the folder and let people republish. I
want to write a script to automatically remove duplicate .xml files so
I can save time.
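
A minimal sketch of what such a script could look like, assuming Python is
available on the server and that duplicates can be keyed on the <ProjectID>
element shown in the sample above (the folder path below is an assumption,
not the actual ViewDrop location on your server):

    # cleanup_viewdrop.py -- sketch only; run while the ViewNotify service is stopped
    import xml.etree.ElementTree as ET
    from collections import defaultdict
    from pathlib import Path

    VIEWDROP = Path(r"C:\ProjectServer\ViewDrop")  # assumed path, adjust to your server

    # Group every .xml drop by the project it belongs to.
    groups = defaultdict(list)
    for xml_file in VIEWDROP.glob("*.xml"):
        try:
            root = ET.parse(xml_file).getroot()
        except ET.ParseError:
            continue  # skip files that are malformed or still being written
        project_id = root.findtext("ProjectID")
        timestamp = root.findtext("TimeStamp")
        if project_id and timestamp:
            groups[project_id].append((timestamp, xml_file))

    # For each project, keep only the newest drop and delete the older ones.
    for project_id, drops in groups.items():
        drops.sort()  # ISO 8601 timestamps sort chronologically as plain strings
        for _, old_file in drops[:-1]:
            old_file.unlink()
            print(f"Removed duplicate drop for project {project_id}: {old_file.name}")

This only deletes older drops for a project that still has a newer,
unprocessed drop waiting, which matches the logic you describe above.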

Thanks

Don

I ran several tests with one of our Project Managers and found that
this is possible. The PM published several different kinds of publishes
and multiples of the same publish for a single plan. After several
.xml files accumulated in the ViewDrop folder, I deleted all but 1 of
the files and then started the ViewNotify service to process the one
remaining .xml file.

All changes that had been made to the project were updated when the
single .xml was processed.

This is good to know if you have a person who is publishing many times
and clogging up the ViewDrop folder. All but 1 of the .xml files can be
deleted and all updates will still take place.

Gary L. Chefetz [MVP]

Don:

Your concern should be why the XML files are accumulating. Are these
enormous projects? Also, you should check to see if your users have turned
on publish on every save along with the autosave feature. Enabling these
together can generate an absurd amount of XML drops and drive server
performance down.




Don

The problem stems from a couple of plans that are way too large. These
plans are taking up to 3 hours to process. While they are processing,
nothing else gets processed. The project managers then get impatient
because they don't see the changes that they made and republish again.
They keep doing this until their plan finally gets published and they
see the changes. What it comes down to is a training issue with the
business people, but the business side doesn't want to change the way
they work, so they think everything should be fixed on the server side.

I will also check into the publish on every save and autosave features.
Is there any way to disable those at the server level?

Thank you for your comments.

Don

Gary L. Chefetz [MVP]

Those options are set in the local .mpt. Are these way-too-large
projects old? Are they suffering from file bloat, maybe? Have a look at
this one:

http://www.projectserverexperts.com/Shared Documents/BinaryRebuild.htm



