
Custom Inventory rule

The string below works, but the report it produces is huge and I need to summarize it. What I need is a total of all Office-type documents stored on each PC in my single organization. The rule I am using now lists every PC along with the path and name of every .doc file, which is simply too big. We are trying to figure out how much storage we will need if we move all of these documents to a server, so all I really need is the count of Office docs on each machine and their total size.

Thanks in advance.


ShellCommandTextReturn(c:\windows\system32\wbem\WMIC.exe datafile WHERE "drive='c:' AND path like '\\users\\%%' AND NOT path like '%%AppData%%' AND Extension='doc'" get name)
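
For reference, one way to reduce output like this to just a count and a total byte size is to let PowerShell do the summing with Measure-Object instead of printing every name. A sketch only; the extension list is illustrative and this has not been verified inside a K1000 rule:

ShellCommandTextReturn(powershell -command "ls -path C:\users\* -i *.doc,*.docx,*.xls,*.xlsx -r -ea SilentlyContinue | ? { $_.FullName -notlike '*\AppData\*' } | measure -s Length | select Count,Sum")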


Comments:
  • I would recommend developing a script that retrieves the data you are looking for and outputs that to a file. The custom inventory rule would then load the data from the file. - chucksteel 10 years ago
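
A rule that loads such a file back into inventory could then be as simple as the following (the path is an illustrative assumption, not a K1000 convention):

ShellCommandTextReturn(cmd /c type C:\ProgramData\office_doc_stats.txt)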

Answers (2)

Posted by: SMal.tmcc 10 years ago
Red Belt
1

If you want to count certain directories like "My Documents" and "Desktop", you can use PowerShell like this:

ShellCommandTextReturn(powershell -command "ls -path C:\users\*\Documents,C:\users\*\Desktop -r | measure -s Length")

or

ShellCommandTextReturn(powershell -command "ls -path %USERPROFILE%\Documents,%USERPROFILE%\Desktop -r | measure -s Length")

This will get a total for all files in those directories. Maybe someone else knows the PowerShell option to add an extension filter to the search.


Comments:
  • My tier 3 googling skills pay off again.
    Here are the commands to look for .docx or .xlsx files in users' Documents or Desktop areas:

    You could even use -i *.doc*,*.xls* to catch both the old and new formats.

    powershell -command "ls -path C:\users\*\Documents,C:\users\*\Desktop -i *.docx,*.xlsx -r | measure -s Length"

    powershell -command "ls -path %USERPROFILE%\Documents,%USERPROFILE%\Desktop -i *.docx,*.xlsx -r | measure -s Length" - SMal.tmcc 10 years ago
    • Very nice. Thank you - wd_bs 10 years ago
      • The ShellCommandTextReturn(powershell -command "ls -path c:\users\*\Documents,c:\users\*\Desktop -i *.docx,*.xlsx -r | measure -s Length")
        version works.

        Still debugging the one with the %userprofile% variable in it. It works as a PS call in a batch file, but I have not figured out the K1000 verbiage yet. - SMal.tmcc 10 years ago
      • By what I read, you cannot use environment variables in custom software inventories. - SMal.tmcc 10 years ago
      • Also, the custom inventory rule runs in the system context, so the %userprofile% variable wouldn't get you results for the currently logged-in user anyway. - chucksteel 10 years ago
  • Now I am trying to convert bytes to megabytes, and it returns a value of 0; I added .sum /1mb to the end of that statement. Once I get that figured out, I need to get a total in gigabytes at the top of the report, and I'm not sure how to go about it. - wd_bs 10 years ago
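
The 0 most likely comes from where the .sum is attached: Measure-Object emits a result object, so the pipeline has to be wrapped in parentheses (or captured in a variable) before .Sum can be divided. A sketch of the full conversion, not tested inside a K1000 rule:

ShellCommandTextReturn(powershell -command "$m = ls -path C:\users\*\Documents,C:\users\*\Desktop -i *.docx,*.xlsx -r | measure -s Length; '{0} files, {1:N2} MB ({2:N2} GB)' -f $m.Count, ($m.Sum/1MB), ($m.Sum/1GB)")

A gigabyte total at the top of the report would still have to be summed across machines on the reporting side, since each rule only returns per-machine values.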
Posted by: StockTrader 10 years ago
Red Belt
0

The best solution is to write a simple PowerShell script (or use whatever scripting language you prefer) to run on all the computers where you need this data.

The output can be a text file or some values in a specific registry key, and a Custom Inventory Rule then picks up this data.

So, to recap:

  1. Write a script that collects this data and stores the results in the registry or a text file in a specific location.
  2. Schedule this script using the scripting module of the K1000.
  3. Create a Custom Inventory Field where you retrieve the output of the script from the registry or text file.

If you want some ideas about how to write the PowerShell script, let us know :-)
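
For example, here is a minimal sketch of step 1 (the registry path, value names, and extension list are illustrative assumptions, not anything the K1000 requires):

# CollectOfficeDocStats.ps1 - sketch: count Office documents on this machine
# and store the totals where a Custom Inventory Rule can read them back.

# Extensions to treat as "Office documents" (adjust as needed)
$extensions = '*.doc','*.docx','*.xls','*.xlsx','*.ppt','*.pptx'

# Scan every profile under C:\Users, skipping AppData noise
$files = Get-ChildItem -Path C:\Users -Include $extensions -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.FullName -notlike '*\AppData\*' }

$stats = $files | Measure-Object -Sum Length

# HKLM:\SOFTWARE\CustomInventory is a hypothetical key; pick your own
$key = 'HKLM:\SOFTWARE\CustomInventory'
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name OfficeDocCount -Value $stats.Count
Set-ItemProperty -Path $key -Name OfficeDocMB -Value ([math]::Round($stats.Sum / 1MB, 2))

Step 3 could then be a rule along the lines of RegistryValueReturn(HKEY_LOCAL_MACHINE\SOFTWARE\CustomInventory, OfficeDocMB, NUMBER), again assuming the key above.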

Kind regards,

StockTrader (Marco)