Channel: Symantec Connect: Data Loss Prevention (Vontu) Customer Group
Viewing all 179 articles

Solution Pack Import Completed with errors

Solution required

Hi

 

This is a fresh install of DLP 11.6. After installing, I have not logged in to the Web Console yet.

 

When I try to import the Financial Solution Pack, I receive the following errors:

 

Nov 21, 2012 1:40:29 PM com.vontu.model.ojb.OJBSystem start
INFO: Opened database: oracle-thin; JDBC connection URL: jdbc:oracle:thin:@(description=(address=(host=***)(protocol=tcp)(port=***))
(connect_data=(sid=***)))
Nov 21, 2012 1:40:29 PM com.vontu.model.notification.rmi.RMINotificationModel start
INFO: Created listener proxy: bind address = 127.0.0.1, timeout = DEFAULT, server object host = 127.0.0.1
Nov 21, 2012 1:40:29 PM com.vontu.model.notification.rmi.ModelListenerProxyImpl notificationActivated
INFO: Listener proxy activated; host: ***, unique ID: 10,741
Nov 21, 2012 1:40:29 PM com.vontu.model.Model createInstance
INFO: Created Model instance using: com.vontu.model.ojb.OjbAuthorizingModel
---------- Testing System Integrity ------------
--------------------- OK -----------------------

The Solution Pack install completed with errors. The database is in an inconsistent state.

Nov 21, 2012 1:40:30 PM com.vontu.model.notification.rmi.ModelListenerProxyImpl notificationDeactivated
INFO: Listener proxy deactivated; host: SV195631.ztb.icb.commerzbank.com, unique ID: 10,741
Nov 21, 2012 1:40:30 PM com.vontu.model.notification.rmi.RMINotificationModel stop
INFO: Listener proxy unexported
Nov 21, 2012 1:40:30 PM com.vontu.model.Model shutdown
INFO: Model instance released: com.vontu.model.ojb.OjbAuthorizingModel

 

Thanks


Installed Software Inventory in Altiris


Hi All,

While working in Altiris, I have run into an issue: to get the list of software installed on each machine, I have to enter a single machine name and fetch the report computer by computer, but I need the results grouped.

I think we should be able to provide a list of computer/host names in a .txt file and get the same details for all machines in a single click. If that is not possible, some similar feature that does the same would be helpful.

SEP virus infection resolution

Solution required

Hi All, 

As I am new to this field, can anyone provide me with resolution guides for frequently encountered virus infections and the steps to resolve them completely without affecting the working of the machine? In other words, the cleanup should not affect OS files, so that there is no impact on, or repair required for, the functioning of the OS.


DLP Custom Tiering

Solution required

Hello. This is urgent and I need to know: can I install the Enforce platform and an Endpoint Prevent detection server on one server, and the Oracle DB on another server (this is what I'm referring to as "custom tiering")? Meaning that only two servers would be required.

Thanks!

 

- Moh


Existing Oracle 11g License

Solution required

Hi there. This time my customer would like to know whether they can utilize their existing license for Oracle 11g. Is this possible?

If so, does Oracle dictate that the DLP DB must be installed on a 'central' database server (along with all other Oracle DB instances in that environment)? Or can a new instance for DLP be installed on its own dedicated server? (This looks more like an Oracle question than a Symantec one, but I will appreciate your assistance.)

Thanks in advance...

 

- Moh

Remove registry entries of uninstalled software on remote machines

Solution required

Hi All,

Please help me with the subject mentioned above; the requirement is as follows.

I have uninstalled some Symantec software using an uninstall script. The script removes the software but sometimes leaves entries behind in the registry.

How can I remove/clean such entries for uninstalled software from the registry on remote machines? I want to do this from one machine against all remote machines using a script or tool.

Please point me to any software, script, or tool for this task.
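One low-tech approach (a sketch, not a supported Symantec tool) is to drive the built-in Windows `reg delete` command against each remote machine from a hosts list; the registry key path below is a hypothetical leftover entry, not a real Symantec location, and the Remote Registry service is assumed to be running on each target.

```python
# Sketch: build "reg delete" commands targeting remote machines.
# Assumptions: the Remote Registry service is running on each target,
# and the hosts list comes from a file with one hostname per line.
# The key path used in the demo is hypothetical.

def build_reg_delete_commands(hosts, key_path):
    """Return one 'reg delete' command per remote host.

    'reg delete \\\\HOST\\HKLM\\...' deletes the key on HOST remotely;
    /f suppresses the confirmation prompt.
    """
    commands = []
    for host in hosts:
        host = host.strip()
        if not host:
            continue  # skip blank lines from the hosts file
        commands.append(rf'reg delete \\{host}\{key_path} /f')
    return commands

if __name__ == "__main__":
    hosts = ["PC-001", "PC-002"]  # normally read from a hosts.txt file
    for cmd in build_reg_delete_commands(
            hosts, r"HKLM\SOFTWARE\ExampleVendor\LeftoverKey"):
        print(cmd)
```

Each printed command can then be run in a batch file or pushed through your usual remote-execution tooling; test against one machine before looping over all of them.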



complete registry cleaner after uninstallation


Dear All,

I am working with Symantec Altiris and have come across a requirement that is sometimes optional, but is often mandatory from an audit and compliance point of view.

My concern is that there is no solution in Altiris to completely clean the registry entries of an uninstalled software package. I hope this requirement may be considered in the future.

 

Thanks & Regards

Kishorilal


Software packaging and converting exe/msi


Dear All,

I have a software package consisting of 8-10 files that I need to bundle into a single .exe setup and then convert into an .msi, so that I can use the .msi package for remote deployment or uninstallation of that software on different remote machines. Altiris has lots of features, but real-life, practical needs like this should also be considered. I hope this may happen in a future product.

 

Thanks

Kishorilal

Keystore passwords and TLS Certs

Solution required

I am trying to set up Network Prevent for Email, and there seems to be a disconnect in the documentation regarding some of the default keystore passwords.

Here are some simple steps to recreate the problem.

 

1. Generate a new keystore as you normally would after install (install guide pg 47 or admin guide pg 315).

This gives you a new enforce.*.sslKeyStore file for the Enforce server and a monitor.*.sslKeyStore file for your detection servers. I've tried this both with a single cert for the detection boxes and using the aliases file to create individual ones for each server. It doesn't matter which way you go.

 

2. On your Network Prevent for Email box, following the steps in the MTA integration guide (pg 34), I go to change the password on the newly generated keystore. Technically you could leave this password the same, assuming you know it, but generally you want to change it. You need to know it for steps 3 and 4.

keytool -storepasswd -new <mynewkeystorepassword> -keystore c:\Vontu\Protect\keystore\monitor.blahblah.sslKeyStore -storepass dummypassword

This command fails with "Keystore was tampered with, or password was incorrect". I've tried:

dummypassword, protect, Protect, prevent, and Prevent, all without success. What password is sslkeytool choosing for these new keystores?

 

3. Now you would do the generate-keypair step on pg 35. It requires you to know the keystore password and to make sure your certificates are using the same password (-storepass) for the key password (-keypass). You can't run this command if you don't know the password from step 2, either the one you changed it to or the default if you didn't.

keytool -genkeypair -dname "dname_string" -alias smtp_prevent -keypass key_password -keystore c:\Vontu\Protect\keystore\prevent.ks -storepass store_password -validity expiration_days

4. Now you would import the public certs from your upstream/downstream MTAs, depending on your config. Again, without the keystore password you can't do this.

 

So in the end you essentially cannot use Network Prevent for Email with TLS appropriately configured if you are using anything other than the default prevent.ks keystore. And what company that is serious about security is going to use the default keystore, where everyone else has that private key and can decrypt data protected by it?

Can someone help me out here? Is there an undocumented default password that sslkeytool is choosing, other than dummypassword?
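While waiting for the documented default, one way to narrow this down is to probe candidate store passwords with a plain `keytool -list`, which exits non-zero when the store password is wrong. A minimal sketch, assuming the JDK's keytool is on PATH and using a hypothetical keystore path:

```python
# Sketch: probe candidate store passwords against a generated keystore
# with "keytool -list". Assumes keytool (from the JDK) is on PATH;
# the keystore path in the demo below is a hypothetical example.
import subprocess

def keytool_list_cmd(keystore, password):
    """Build the keytool command that merely lists the keystore;
    keytool exits non-zero when the store password is wrong."""
    return ["keytool", "-list", "-keystore", keystore,
            "-storepass", password]

def find_store_password(keystore, candidates):
    """Return the first candidate password keytool accepts, else None."""
    for password in candidates:
        result = subprocess.run(keytool_list_cmd(keystore, password),
                                capture_output=True)
        if result.returncode == 0:
            return password
    return None

if __name__ == "__main__":
    # Candidates mentioned in the thread; the path is hypothetical.
    guess = find_store_password(
        r"c:\Vontu\Protect\keystore\monitor.example.sslKeyStore",
        ["dummypassword", "protect", "Protect", "prevent", "Prevent"])
    print("store password:", guess)
```

If none of the candidates succeed, that at least confirms the generated keystore's password is not any of the documented defaults, which is worth including in a support case.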

 

Thanks!

Connection String for SQL DB Scan - Unable to retrieve the list of table names

Solution required

We are trying to find the correct connection string to scan our SQL databases. I've read the previous discussions and tried all the connection strings I could find, to no avail.

Please take a look at this and let me know any suggestions for other things to try. Also, are there specific database permissions we should set up?

We are getting the error:

   "Failed to read sqlserver://hpsqld04.mycompany.com:1433/SecAdmin_Test;instance=mssqlserver; error: Unable to retrieve the list of table names: null"

Under the Scanned Content tab, Scan Database Servers, these are the connection strings we've tried (all get the same error):

sqlserver://hpsqld04.mycompany.com:1433/SecAdmin_Test;instance=mssqlserver

sqlserver://hpsqld04.mycompany.com

sqlserver://hpsqld04.mycompany.com:1433/SecAdmin_Test

Use These Credentials:   UserName:  mycompany\svcDLPuser -- database permission System Administrator
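To keep the variants straight while experimenting, the `sqlserver://host:port/database;instance=name` format used above can be assembled programmatically. A small sketch (the host, database, and instance names are the hypothetical ones from this post, not values from Symantec documentation):

```python
# Sketch: assemble DLP Discover SQL-database target strings in the
# "sqlserver://host:port/database;instance=name" format shown above.
# The example host/database/instance names are hypothetical.

def sqlserver_connection_string(host, port=1433, database=None, instance=None):
    """Build a sqlserver:// target string; database and instance are optional."""
    target = f"sqlserver://{host}:{port}"
    if database:
        target += f"/{database}"
    if instance:
        target += f";instance={instance}"
    return target

print(sqlserver_connection_string(
    "hpsqld04.mycompany.com", 1433, "SecAdmin_Test", "mssqlserver"))
# -> sqlserver://hpsqld04.mycompany.com:1433/SecAdmin_Test;instance=mssqlserver
```

Generating each variant from the same inputs rules out typos as the cause when only the error message changes between attempts.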

 

What's New in IT Analytics Symantec Data Loss Prevention 3.0


Building on the success of the previous version of IT Analytics for Symantec Data Loss Prevention and incorporating some fantastic user feedback, Symantec has just released version 3.0 of the reporting content pack. For existing IT Analytics customers, the new version of IT Analytics for Symantec Data Loss Prevention is now available for upgrade through the Symantec Installation Manager. Some of the highlights within the new version include:

New Cube: Incident Status History

This new cube contains historical information about incident status changes within the Data Loss Prevention system, including details about who performed the change and when. Information specific to this cube includes the total number of incident actions, change date, user name, and more.
 

Cube Updates

All cubes have been updated to be more consistent with DLP nomenclature and several cubes have been updated with additional dimensions and measures to provide greater options in reporting. Additionally, the DLP Discover Scans cube has been updated to support all scan types. For cube definitions, including the list of available measures and dimensions, please see the official IT Analytics for Symantec Data Loss Prevention 3.0 User Guide.
 

New Reports

Dozens of new out-of-the-box reports were added in the new release, including the list below. Report subscriptions can be enabled for all of these reports so that they can be received via email on a recurring basis. For definitions of each report, please see the official IT Analytics for Symantec Data Loss Prevention 3.0 User Guide:

  • DLP Auditing – User Action Auditing
  • DLP Auditing – User Event Details
  • DLP Auditing – User Incident Event Summary
  • DLP Deployment – Agent Search
  • DLP Deployment – Agent Version by Server
  • DLP Deployment – Policy Evolution Trend
  • DLP Deployment – Scan Summary
  • DLP Investigations – Discover File Incidents by File Owner Trend
  • DLP Investigations – Networking File Incidents by Networking User Trend
  • DLP Investigations – User Incident Details
  • DLP Investigations – User Incident Search
  • DLP Normalized Risk – Frequency of Discover Incidents vs. Files Scanned Trend
  • DLP Normalized Risk – Frequency of Discover Incidents vs. GB Scanned Trend
  • DLP Normalized Risk – Frequency of Email Incidents (Email Prevent)
  • DLP Normalized Risk – Frequency of Web Incidents
  • DLP Policy Optimization - Policy Change Audit
  • DLP Policy Optimization – Policy Change Impact
  • DLP Policy Optimization – Policy Change Trend
  • DLP Policy Optimization – Policy Changes
  • DLP Remediation – Discover Incident Details
  • DLP Remediation – Discover Incident Search
  • DLP Remediation – Endpoint Incident Details
  • DLP Remediation – Endpoint Incident Search
  • DLP Remediation – Incidents Search
  • DLP Remediation - Incident Status History Details
  • DLP Remediation – Network Incident Details
  • DLP Remediation – Network Incident Search
  • DLP Remediation – Remediator Productivity
  • DLP Statistics – Discover Scanned File Trend
  • DLP Statistics – Discover Scanned Storage Trend
  • DLP Statistics – Endpoint Incident Trend by Channel
  • DLP Statistics – Organizational Incident Trend
  • DLP Statistics – Incidents by Policy
  • DLP Statistics – Incidents by Product Area
  • DLP Statistics – Incidents by Severity
  • DLP Statistics – Incidents by Status
  • DLP Statistics – Incident Trend by Product Area
  • DLP Statistics – Scans
  • DLP System Management – Agent Summary by Status
  • DLP System Management – Agent Summary by Version
 

Processing Performance

Cube processing performance has been greatly improved and optimized to provide shorter processing times on average. NOTE: The processing time varies depending on the amount of data to be included in the cubes and the server hardware specifications present in your environment.

Download and install the new version today and gain greater flexibility and insight into your Symantec Data Loss Prevention reporting!

SDLP integration with CITRIX ShareFile

Solution required

Has anyone integrated DLP into monitoring their CITRIX ShareFile (on-premise)? If so, I would like to know more details on the "what and how." I am looking for configuring, performance and tips.  Thank you.

I am presuming we could monitor/intercept (DIM) ShareFile HTTP/HTTPS outbound to the cloud using our current Blue Coats and Web Prevents with no new additions or work. Or, for that scenario, we could get alerts only using the edge Network Monitors. But:

  • How do you integrate SDLP and Sharefile for an on-premise ShareFile instance? Please give details.
  • Is "alert only" mode more appropriate for the traffic going to ShareFile? Please expound and elucidate with your rationale.
    • And if so, Why not simply set up and use a Network Monitor for alerts?  And, if that is in fact what you’ve done, how did you configure it?  Again, details are most helpful.

Additionally, I could also see scheduling and doing DAR/Discover scans on the logs of CITRIX ShareFile.  Is anyone doing that?  And if so, is it in addition to DIM/WP? Please expound and elucidate.

 

Thank you,

 

DLP Flex Response error message

Solution required

Dear all:

When user NT2486 presses the Flex Response button, a PNG appears with the message: "An unexpected error has occurred. This could be due to one of the following: 1) Your session timed out and you selected a link that was no longer valid, 2) You used the browser back or forward button, placing the system into an inconsistent state, or 3) The system experienced a temporary problem." However, the incident itself shows normal processing.

Has anyone encountered this situation before? How can this problem be resolved?

 

Thank you

Chris

Upgrading to IT Analytics for Symantec Data Loss Prevention 3.0


Current users of IT Analytics for Symantec Data Loss Prevention 2.0 can now upgrade their installation to the new 3.0 version recently released by Symantec and gain significant benefits in both reporting and performance. This article outlines the process of upgrading from IT Analytics for Symantec Data Loss Prevention version 2.0 to version 3.0 in a simple, step-by-step format.

Upgrade Checklist

Before you perform the upgrade in your environment, consider the following:

  • This article assumes you will be upgrading on the same server. If you are moving to another server and installing IT Analytics for Symantec Data Loss Prevention 3.0 at the same time, consider the following article on migrating an IT Analytics installation.
  • Ensure the version of the Symantec Management Platform you are running is at least 7.1 SP2. If it is a prior version, you will need to upgrade the Symantec Management Platform before upgrading IT Analytics.
  • Perform a backup of the server hosting the Symantec Management Platform and IT Analytics Data Loss Prevention, using the backup tool of your choice.
  • Perform a backup of the CMDB database and the IT Analytics database in SQL Analysis Services (if SQL is hosted off-box). For more information about how to back up the CMDB database, see the following knowledge base article.
  • This article assumes you have administrator access to the Symantec Management Console.
  • Record the following configuration settings in the Symantec Management Console, in case you need to configure similar settings after the upgrade:
    • DLP IT Analytics connection settings to Analysis Services and Reporting Services under: Settings > Notification Server > IT Analytics Settings > Configuration
    • Connection settings to the DLP database under: Settings > Notification Server > IT Analytics Settings > Connections > Symantec Data Loss Prevention
    • Processing schedules under: Settings > Notification Server > IT Analytics Settings > Processing

CAUTION: When you initiate the upgrade of IT Analytics from 2.0 to 3.0, the existing cubes and reports are uninstalled due to the change in schema between versions. The new out-of-the-box reports and cubes must be reinstalled once the upgrade has completed. If you have customized any of the out-of-the-box cubes and reports in version 2.0, you must reapply those changes after upgrading to the 3.0 version. Any net-new reports or cubes that were created in the previous version are not affected by the upgrade; however, because of schema changes in the new version, they may not work as expected. If you have not modified the existing cubes or reports and have not developed any new cubes or reports, there are no additional steps beyond what is listed below.

 

Starting the Upgrade Process

Follow the steps below to upgrade to IT Analytics for Data Loss Prevention 3.0: 
  1. Open the Symantec Installation Manager by clicking Start > All Programs > Symantec > Symantec Installation Manager, and allow the application to load.
  2. On the Installed Products screen, you should see at least one product available for upgrade.

NOTE: Clicking 'Upgrading installed products' will allow you to upgrade to the latest version of IT Analytics for Symantec Data Loss Prevention; however, this may also include other product upgrades or Symantec Management Platform maintenance packs along with it. For the purposes of this article, we will use a method that upgrades only IT Analytics for Symantec Data Loss Prevention, as described below.

  3. Click the Install New Products link at the top and, on that screen, change the filter from Suites to Solutions.
  4. Scroll down the list and check Symantec IT Analytics Data Loss Prevention Pack 3.0, or simply search for 'analytics' in the upper right to do a quick find.
  5. Click Next.
  6. Optional - On the Optional Installations page, select the Language Packs for installation and then click Next.
  7. On the End User License Agreement page, verify that the correct products were selected, check 'I accept the terms in the license agreements,' and then click Next.
  8. Verify that your contact information has not changed and then click Next.
  9. On the Review Installation Details page, verify that Symantec IT Analytics Data Loss Prevention Pack 3.0 is listed.
  10. Click Begin install to start the download and installation process.
  11. If you are prompted to back up Notification Server cryptographic keys, click Skip. This step is not necessary for upgrading to IT Analytics for Data Loss Prevention 3.0.
  12. Verify that the Installation Complete screen is displayed and click Finish.
  13. On the resulting Installed Products screen, verify that the version for IT Analytics for Data Loss Prevention is now listed as 3.0.

 

Reinstalling Cubes and Reports

Once the upgrade completes, you need to reinstall the cubes and the reports that are included in IT Analytics for Data Loss Prevention version 3.0.

Reinstalling Cubes

  1. In the Symantec Management Console, on the Settings menu, click Notification Server > IT Analytics Settings.
  2. In the left pane, expand the Cubes folder.
  3. On the Cubes page, click the Available tab.
  4. Check all the cubes that you want to install. To install all of the available cubes, in the header row of the table, click Install.
  5. Click Save Changes.
  6. At the prompt, click OK to proceed with the installation.
  7. The IT Analytics Event Viewer window displays the progress of each cube that was selected. Click Close when the process has completed.
  8. Verify that the cubes were successfully created by clicking the Installed tab, and then review the list of cubes.

 

Reinstalling Reports

  1. In the left pane, expand the Reports folder.
  2. In the Report Setup window, click the Available tab.
  3. Check all the reports that you want to install. To install all of the available reports, in the header row of the table, click Install.
  4. Click Save Changes.
  5. At the prompt, click OK to proceed with the installation.
  6. The IT Analytics Event Viewer window displays the progress of each report that was selected. Click Close when the process has completed.
  7. Verify that the reports were successfully installed by clicking the Installed tab, and then review the list of reports.

 

Reconfiguring the Cube Processing Tasks

You can create and assign processing schedules for all installed cubes. Your business needs dictate how often the cubes should be processed; for a typical configuration, all cubes should be processed daily. This task is essential for IT Analytics to function properly because the cubes do not contain any data until cube processing is complete.

Note: If you had previously created cube processing tasks in the 2.0 version, those tasks should still be available after the upgrade, but because the cubes were uninstalled and reinstalled, you will have to reassociate the specific cubes with the appropriate processing tasks. Also, keep in mind that the new Incident Status History cube will have to be assigned to a processing task.

To reconfigure the cube processing tasks:

  1. In the Symantec Management Console, on the Settings menu, click Notification Server > IT Analytics Settings.
  2. In the left pane, expand the Processing folder. You should see that all cubes require processing.
  3. If you are only using the default processing task, select the schedule that you want and then check the Enabled box. Symantec recommends processing cubes no more than once a day, depending on the number of cubes and the amount of data in your environment. If you are using previously configured processing tasks, check that the schedules are in line with expectations.
  4. Check the box for each available cube that you want to be processed on the current schedule. For a typical configuration, select all cubes; however, depending on the amount of data in your Oracle DLP database, you may need to create multiple processing tasks for optimum performance.
  5. Click Save Changes and confirm that the processing task is saved.
  6. You can either wait until the scheduled processing time or click Run Now. The selected processing tasks start asynchronously, which means that the task does not finish by the time the page refreshes. This task can take several minutes to execute; the execution time depends on the number of cubes selected and the size of the data within the database. You can monitor progress by viewing the events in the IT Analytics Event Viewer window while the manual processing task executes.
  7. After the processing trace has completed, click Close; you should notice that all of the cubes have now been processed.

 

Verifying Your Upgrade

After cube processing completes, you can verify your upgrade and ensure that all of your configuration steps completed successfully.

To verify your upgrade:

  1. In the Symantec Management Console, on the Reports menu, click All Reports.
  2. In the left pane, under IT Analytics, expand the Cubes folder and then click the new Incident Status History cube.
  3. From the pivot table field list, drag in Status Changes and Incident - Product Area to create a quick cube view and ensure you are getting data. This indicates that both the upgrade and cube processing completed successfully.
  4. In the left pane, under IT Analytics, expand the Reports folder and then click the new DLP Remediation - Incident Search report. You should also see a much longer list of reports than was there previously. Once this report loads, it indicates that the new reports from the upgrade were installed successfully.


Upgrading to IT Analytics for Symantec Data Loss Prevention 3.0


Current users of IT Analytics for Symantec Data Loss Prevention 2.0 can now upgrade their installation to the new 3.0 version recently released by Symantec and gain significant benefits in both reporting and performance. This video outlines the process of upgrading from IT Analytics for Symantec Data Loss Prevention version 2.0 to version 3.0 in a simple, step-by-step format.

[Embedded video: ID 2683226300001 (Public)]

IT Analytics for Symantec Data Loss Prevention 3.0 - Cube Processing Recommendations


Cube Process Scheduling Recommendations

IT Analytics for Symantec Data Loss Prevention 3.0 extracts data from the Oracle DLP Enforce database(s) on a scheduled basis. The extracted data is then stored in multidimensional cubes within the Microsoft Analysis Services database; once processed, these cubes act as the data sources for the reports and dashboards in IT Analytics.

The frequency of the cube processing schedules determines how current the data in the cubes is. Depending on business requirements, this frequency may vary, but the general recommendation for cube processing is once a day for some cubes and weekly for others (as described below). Note that several variables affect the duration of cube processing tasks, but the two major factors are:

  1. Hardware specifications of the SQL Server hosting Analysis Services
  2. Amount of data being processed (i.e. overall size of the Oracle DLP database)

The lower the hardware specifications of the SQL Server and the greater the amount of data to process, the more time it will take, and vice versa. To optimize cube processing performance, it is recommended that you create two separate tasks that process cubes on two different schedules, per the grouping below:

Group 1 Cubes (Process Daily)          Group 2 Cubes (Process Weekly)
DLP Incident Summary Cube              DLP Incident Details Cube
DLP Discover Incident Summary Cube     DLP Discover Incident Details Cube
DLP Endpoint Incident Summary Cube     DLP Endpoint Incident Details Cube
DLP Network Incident Summary Cube      DLP Network Incident Details Cube
DLP Agent Status Cube                  DLP Policy History Cube
                                       DLP Incident Status History Cube
                                       DLP Discover Scans Cube
                                       DLP Incident History Cube
                                       DLP User Action Audit Cube
                                       DLP Network Statistics Cube

The first task will include all the DLP summary cubes and be processed daily. This should provide enough information on a daily basis to give end users the visibility they need into their DLP environment. The second process includes the more detailed and historical cubes which only need to be processed weekly. This orientation helps to expedite cube processing and ensure the right data is available for end users. 
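The daily/weekly split above can be captured as a simple lookup, which a scheduling script could use to decide which processing task each cube belongs to (a sketch; the cube names are taken from the grouping table above):

```python
# Sketch: the daily/weekly cube grouping above as a lookup table.
# Summary/status cubes process daily; everything else processes weekly.

DAILY_CUBES = {
    "DLP Incident Summary Cube",
    "DLP Discover Incident Summary Cube",
    "DLP Endpoint Incident Summary Cube",
    "DLP Network Incident Summary Cube",
    "DLP Agent Status Cube",
}

def processing_group(cube_name):
    """Return 'daily' for the summary/status cubes, 'weekly' otherwise."""
    return "daily" if cube_name in DAILY_CUBES else "weekly"

print(processing_group("DLP Agent Status Cube"))      # -> daily
print(processing_group("DLP Incident Details Cube"))  # -> weekly
```

Keeping the grouping in one place like this makes it easy to move a cube between the daily and weekly tasks as your data volumes change.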

 

Cube Processing Benchmarks (General Estimates)

Your business requirements may stipulate that data must be updated daily, in which case all cubes may need to be processed each day. Using the cube groupings outlined above, you can run these tasks sequentially on a daily basis; however, be careful to allow enough time for the first task to finish before the next one begins. Again, depending on hardware resources and the amount of data in the DLP database, it will take some trial and error to optimize completely. To help you get started, the tables below provide some general benchmarking estimates for cube processing (based on environment size and hardware specifications) so you can determine the approximate times necessary for your environment.

NOTE: The processing intervals listed below are estimates ONLY. Your times will vary based on the hardware specifications and amount of data in your environment. These times are offered as general guidelines only.

 
Incident Count        Small     Medium     Large
Endpoint Incidents    5,000     10,000     4,000,000
Network Incidents     40,000    500,000    4,000,000
Discover Incidents    10,000    50,000     1,000,000

 
Hardware Component    Small        Medium        Large
Hardware Type         Virtual      Virtual       Physical
Processors            Quad Core    Eight Core    64 Core
RAM                   8GB          8GB           256GB

The table below provides guidance on the impact the SQL Server hardware (as defined above) has on the time it takes to process a given cube.

Processing times per SQL hardware option:

IT Analytics DLP Cubes                Small    Medium    Large
DLP Administrative Events Cube        10s      10min     30min
DLP Scans Cube                        30s      5min      30min
DLP Agent Status Cube                 20s      20s       1hr
DLP Network Incident Summary Cube     3min     30min     2hrs
DLP Discover Incident Summary Cube    4min     5min      3hrs
DLP Endpoint Incident Summary Cube    3min     1min      3hrs
DLP Incident Summary Cube             3min     30min     3.5hrs
DLP Incident Status History Cube      30min    2hrs      4.5hrs
DLP Messages Cube                     5s       1hr       3hrs
DLP Network Incident Details Cube     3min     1hr       5hrs
DLP Discover Incident Details Cube    4min     5min      5hrs
DLP Endpoint Incident Details Cube    3min     1min      5hrs
DLP Incident Details Cube             3min     1hr       5hrs
DLP Incident History Cube             3min     1hr       5hrs
DLP Policy History Cube               1min     45min     4hrs

 

Upgrading 11.6 to 12.0

Solution required

Hi All

We are planning to upgrade DLP 11.6 to 12.0 and want to know whether any features were removed in DLP 12.0 relative to 11.6.

I've checked the What's New guide and the release notes but don't see any feature removals as such, so I thought of asking the DLP experts. Thanks.

 

 

Need admin guide and install guide for Sym DLP

Solution required

Hi,

 

I need the admin guide and install guide for both versions 12.0 and 11.6.3.

Can somebody help?


IT Analytics for Symantec Data Loss Prevention - Glossary of Terms


IT Analytics introduces powerful ad-hoc reporting and business intelligence tools, and along with it a few terms that may be new to you. To alleviate any confusion, this article describes a few key terms so that you can easily understand out-of-the-box functionality and start using the tool to gain deeper insight into your DLP data to make informed decisions.

Measure: Measures are the aggregate counts, or how you quantify results when creating a pivot table view. These typically make up the columns in your report. Every view you create must contain at least one measure. (For example: Incidents Count.)

Dimension: Dimensions are groupings of the specific data types you are quantifying when you create a pivot table view. These typically make up the rows in your report, but dimensions can also be used across columns or as filters. Every view you create must contain at least one dimension. If you have more than one dimension, you can drill in and out or change the order of dimensions to arrange the report the way you want it. Please see the Connect article for a list of all dimensions in IT Analytics.

Attribute: An attribute is a sub-grouping of data types for a specific dimension. A dimension may have one or more attributes, and these can be used like any other dimension. (For example: Policy - Description, Policy - Status, Policy - Name, Policy - ID.) Please see the Connect article for a list of all dimension attributes in IT Analytics.

Key Performance Indicator (KPI): Quantifiable measures that represent a critical success factor in an organization. The emphasis is on the action of quantifying something in the environment. KPIs must be measurable to be successfully monitored and compared against a given objective. (For example: Number of Alerts in the Last 30 Days.) Please see the Connect article on creating a key performance indicator in IT Analytics.

Cube: Multidimensional data structures (as opposed to a relational database) that store precompiled information from the DLP Oracle database(s). Cubes contain measures and dimensions that are arranged in a specific way for common reporting purposes. These are the underlying source for all reporting in IT Analytics and are stored in the Analysis Services component of SQL Server. Please see the Connect article for a list of all cubes in IT Analytics.

Report or Dashboard: Pre-developed reports that are hosted by the Reporting Services component of SQL Server. Several out-of-the-box reports and dashboards are available upon install, and you have the flexibility to create your own through Report Builder.

SQL Analysis Services: The free component of SQL Server that hosts and processes all cubes within IT Analytics. This component is required to install IT Analytics. Please see the Connect article on configuring Analysis Services and installing IT Analytics.

SQL Reporting Services: The free component of SQL Server that hosts all reports and dashboards within IT Analytics. This component is required to install IT Analytics. Please see the Connect article on configuring Reporting Services and installing IT Analytics.

Report Builder: A client-side application (developed by Microsoft and free with Reporting Services) that you can use to create and design reports. Using Report Builder, you can design reports based on your data from within IT Analytics without having to understand the underlying schema or complex programming languages. Please see the Connect article on creating custom reports in Report Builder.

Pivot Table: An arrangement of measures and dimensions from a specific cube in tabular form, with the goal of creating an ad-hoc report. Please see the Connect article on working with pivot tables in IT Analytics.

Pivot Chart: An arrangement of measures and dimensions from a specific cube in chart format, with the goal of creating a visually informative report. Please see the Connect article on working with pivot tables in IT Analytics.

Content Pack: A software component that bundles cubes, reports, and dashboards specific to a particular Symantec solution suite. IT Analytics content packs are currently available for:

  • IT Management Suite (Altiris)
  • Symantec Endpoint Protection
  • Data Loss Prevention
  • Critical System Protection
  • ServiceDesk

Parameter: Typically a dimension attribute used to filter data within an IT Analytics report or dashboard. This technique is used within Report Builder when creating reports.

Processing Schedule: The frequency at which data is purged and then recompiled within the IT Analytics cubes. Typically this is done once a day, but depending on the environment, server resources, and business requirements, it can be set to process more frequently. This schedule is set within the configuration page of IT Analytics, but the processing itself occurs within SQL Analysis Services.

Symantec Management Platform: This application hosts the IT Analytics configuration and reporting interface. It is required to install IT Analytics. Please see the Connect article on installing the Symantec Management Platform.

Symantec Installation Manager: This application allows you to download, install, and update software hosted by the Symantec Management Platform, including IT Analytics. To install the Symantec Installation Manager, please download the IT Management Suite trialware from Symantec's trialware site.

 
