
Scope for TPAM Automation

Hi Team,

Can someone please advise whether we can automate the tasks below in TPAM?

We perform these tasks manually once a month and would like to know if they can be automated:

Adding systems to different password check and change schedules for scheduled password changes

Deleting systems that have been decommissioned

Configuring monthly reports on password change success/failure and sending them by email to the team

Appreciate your response on this. Thank you. 

  • As ever with TPAM, there are a number of ways to automate these tasks.

    Probably the easiest way, however, is to use the CLI or API provided with TPAM.

    Bash, PowerShell, or even good old batch files can be used to automate tasks from the command line.

    I use the PuTTY plink.exe command-line tool to provide the SSH connectivity the CLI commands require.
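
    A minimal sketch of that approach in PowerShell, assuming a dedicated CLI user with a PuTTY key file; the paths, host name, and the 'ListSystems' command shown are placeholders to verify against your TPAM CLI guide:

        # Run a single TPAM CLI command over SSH via plink.exe (one connection per command).
        $plink   = 'C:\Tools\plink.exe'          # path to plink.exe (assumption)
        $tpam    = 'cliuser@tpam.example.com'    # TPAM appliance and CLI user (assumption)
        $keyFile = 'C:\Keys\tpam-cli.ppk'        # private key for the CLI user (assumption)

        # 'ListSystems' stands in for whatever CLI command you need; check the TPAM CLI guide for exact syntax.
        $result = & $plink -ssh -batch -i $keyFile $tpam 'ListSystems'
        $result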

    Perl, Java, and C# .NET APIs are also available.

    One thing to remember with the CLI, and with all but the .NET API, is that you cannot open a persistent SSH connection. The sequence is:

         The script/API makes an SSH connection to TPAM

         The CLI/API command is executed

         TPAM returns the results

         TPAM closes the connection

    So when you write your code, you need to take this into account and create loops accordingly.
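
    For instance, a hedged sketch of that loop pattern in PowerShell, updating a batch of systems read from a CSV file; the CSV layout and the 'UpdateSystem' command and its arguments are assumptions to check against the CLI guide:

        # One SSH connection per CLI command: call plink once for each system in the input file.
        $plink   = 'C:\Tools\plink.exe'
        $tpam    = 'cliuser@tpam.example.com'
        $keyFile = 'C:\Keys\tpam-cli.ppk'

        # systems.csv with columns SystemName,Schedule is an assumed input format.
        $systems = Import-Csv 'C:\Automation\systems.csv'

        foreach ($s in $systems) {
            # 'UpdateSystem' and its arguments are placeholders for the real CLI syntax.
            $cmd    = "UpdateSystem $($s.SystemName) $($s.Schedule)"
            $output = & $plink -ssh -batch -i $keyFile $tpam $cmd
            Write-Output "$($s.SystemName): $output"   # TPAM returns the result and closes the connection
        }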

    By design, there is no way to create custom reports or queries within TPAM. Reporting requires CPU and memory resources that could impact TPAM's ability to perform core tasks.

    You can send the standard TPAM reports via email but only on the pre-defined schedules. 

    Again, you could write custom CLI or API code to extract data and produce your reports. This approach is fine where you need point-in-time reports based on data currently in the solution, but depending on your log rotation it may not give you the historic reporting you need.
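
    As an illustration of that point-in-time approach, a sketch that pulls data over the CLI, writes it to CSV, and mails it to the team; the 'ListActivity' command, SMTP server, and addresses are all assumptions:

        # Extract point-in-time data via the CLI, save it as CSV, and email it.
        $plink   = 'C:\Tools\plink.exe'
        $tpam    = 'cliuser@tpam.example.com'
        $keyFile = 'C:\Keys\tpam-cli.ppk'
        $csv     = 'C:\Reports\password-activity.csv'

        # 'ListActivity' is a placeholder for the CLI command that returns the data you need.
        & $plink -ssh -batch -i $keyFile $tpam 'ListActivity' | Set-Content -Path $csv

        # SMTP server and mail addresses are assumptions; substitute your own.
        Send-MailMessage -SmtpServer 'smtp.example.com' `
                         -From 'tpam-reports@example.com' -To 'team@example.com' `
                         -Subject 'Monthly TPAM password report' -Body 'Report attached.' `
                         -Attachments $csv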

    In that case the TPAM data extracts can be exported on a schedule to populate an external SQL database.

    Raw CSV data can be extracted every day and dumped in a share on an archive server.

    From this share the data is imported into the DB using standard DB tools.

    You can then create whatever custom reports you wish, with the full capabilities of your chosen SQL DB available to build the reports and run them on a schedule.
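
    A hedged sketch of that last step, loading a day's CSV extract from the archive share into a SQL Server table with the SqlServer PowerShell module; the share path, instance, database, and table names are assumptions:

        # Import the daily TPAM CSV extract from the archive share into a SQL database table.
        Import-Module SqlServer    # provides Write-SqlTableData

        $share = '\\archive\tpam-extracts'    # archive share (assumption)
        $file  = Join-Path $share ("extract-{0:yyyyMMdd}.csv" -f (Get-Date))

        # Append the rows; -Force creates the table on first run. Instance/DB/table names are assumptions.
        Import-Csv $file |
            Write-SqlTableData -ServerInstance 'sql01' -DatabaseName 'TPAMReporting' `
                               -SchemaName 'dbo' -TableName 'PasswordChangeHistory' -Force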

    Tim