Posts Tagged ‘mft’

Managed File Transfer is for SMBs too

Recent research suggests that the SMB sector is coming under an increasing number of cyber attacks from hackers and cyber criminals.  Figures published by Symantec revealed that the number of attacks on companies with fewer than 250 staff doubled in the six months to June 2012.  Similarly, AVG reported that it was predicting an increase on the £3.37 million of damage inflicted on UK SMBs last year, and Verizon confirmed that the majority of the 855 data breaches analysed in its Data Breach Investigation Report had been inflicted upon SMBs.

The question is: why are SMBs being targeted?  It would appear that the modest budgets available to small companies make them easier targets, given the lower level of expenditure on information security technologies.  Add to this that SMBs frequently work as suppliers for larger organisations, and they become a more attractive proposition to hackers than the more conventional direct attack on the corporate target.

What does this have to do with managed file transfer, I hear you ask?  MFT has generally been considered a technology more appropriate to corporate organisations, with its big price tag and grand title.  Times are changing for the MFT marketplace, however, and there are now some very comprehensive solutions available at really competitive prices.  For those SMBs considering how to secure data transfers with their larger corporate customers, technologies exist at around the £4,000 mark which provide much the same functionality as many corporate deployments.

To discuss your file transfer requirements, whether you’re an SMB or multi-national organisation, get in touch with Pro2col or give us a call on 0333 123 1240.


Ad Hoc Managed File Transfer – Microsoft Outlook, Lotus Notes but nothing for Apple Mac?

It would be fair to say that a clear majority of the IT professionals we speak to on a weekly basis want to regain some control over the data that their employees send.  In the past, many IT administrators would start with the customary approach of limiting the size of email attachments that could be sent via the email server. However, this only resulted in employees searching for alternative ways of sending files – frequently the choice was a SaaS/cloud based, consumer grade solution such as YouSendit, Mailbigfile or Dropbox; the list goes on!

In general employees’ intentions are honourable, but when they need to ‘get the job done’ they opt for the path of least resistance, i.e. quick and easy solutions that they can sign up to for free.  This, however, creates a minefield for the IT and compliance departments, which tackle a daily myriad of issues around tracking, governance and security of corporate data. With no online service completely immune from attack (take the recent announcements around LinkedIn and Yahoo), the issue is very real and pressing.

Inevitably the bottom line is that the business needs to invest in an ad hoc file transfer solution – one that enables employees to send files as and when they need to.  Fortunately this issue has existed for some time, and a number of the managed file transfer vendors we work with offer solutions to it.  Selecting the right solution depends upon the number of users, security requirements, size of the data sets, preferred deployment option (on-site/hosted/SaaS), specific functionality required and budget – but that’s where we come in, helping you through the vendor selection process.

Finally, let’s not forget that there is a wide variety of email clients used in today’s businesses.  The most dominant, Microsoft Outlook, has been well addressed by the majority of managed file transfer vendors.  Lotus Notes, on the other hand, hasn’t received the same time and attention – there are only a couple of options out there, presumably down to its shrinking market share.  When it comes to Apple, however, everyone falls short.  This has little to do with the developers’ ability and more to do with the architecture of Apple Mail, Microsoft Entourage and Outlook for Mac, none of which support plugins in the same way.  Still, all of the vendors provide a webmail facility if you need to support Mac OS users.

For further assistance in selecting the most appropriate ad hoc file transfer solution for your business please get in touch with the UK’s leading MFT experts on 0333 123 1240.


Dropbox or MFT – that is the question…

Once again the spotlight has been turned on Dropbox following its recent investigation into irregularities in its service.  This latest occurrence looks likely to conclude with Dropbox announcing that a number of its European users’ email addresses or account details have been compromised; we’ll watch with anticipation for the outcome.

Whilst cloud solutions certainly offer users a raft of features currently not offered by other proprietary software vendors, they tend to be targets for hackers due to their high profile and wide adoption.  It begs the question: “Should consumer grade cloud based technologies be allowed within the enterprise at all?”

Managed file transfer vendors are fast catching up with the cloud based solutions, adding Dropbox-like features to provide users with the simple way of working to which they’ve become accustomed, as well as new features such as mobile file sharing to cover off the BYOD angle.  Never before have MFT vendors been trying so hard to keep up with the contemporary features that users are demanding.

There are many benefits associated with implementing an in-house managed file transfer solution.  Possibly the most important in terms of security, taking the above into consideration, is that an in-house deployment doesn’t become a high-profile data centre target.  Taking the understated, in-house route may well help you slip under the radar whilst also providing a range of security features.

If your company is using Dropbox to share data then Pro2col can help.  We’ve worked with various companies across a range of industry sectors to replace consumer grade file sharing technologies.  Call one of our sales consultants on 0333 123 1240.


Hosted vs On-Premise Managed File Transfer

Over the past couple of years there has been considerable hype around SaaS, hosted and cloud based solutions, and the managed file transfer marketplace has been no different.  We speak to many businesses on a daily basis about their file transfer requirements and inevitably a number of them ask for a cloud based solution. So we’ve been speaking to a range of our vendors, natural leaders in the software field, but many of them seem unwilling to step out of their traditional marketplace and into the cloud space.  There are of course good reasons for this: the impact on existing software sales, the responsibility that goes with managing other people’s data and, probably most importantly, the size of the market.

According to Gartner, only 10% of the managed file transfer marketplace actually relies upon a cloud based solution.  With the managed file transfer sector experiencing 20%+ growth year on year and the shift to cloud solutions not likely to slow any time soon, the transition to cloud services could become the next major battlefield for vendors. As it stands, however, this certainly isn’t reflected in the managed file transfer marketplace, as there are many more software vendors than service providers.


If you’re thinking about implementing a hosted or on-premise managed file transfer solution, there are a number of key points to consider.  Here are a few to start you off:

Cost of Ownership

Cost is the number one factor influencing the choice of managed file transfer solution for most businesses.  Implementing and managing an on-premise MFT solution can be pricey; some of the costs to consider are:

Software – The initial software purchase price can range from £5,000 to in excess of £50,000 but once this has been paid, the solution is yours.  Although hosted services appear cheaper in the short term, ongoing subscriptions can be costly in the long term.

Hardware – Providing an environment to install and run the MFT solution can also add up, especially when you take into account disaster recovery or high availability.

Infrastructure – Hosting files on your own server can prove a problem; bear in mind the impact on your Internet connection when a 1GB file is shared with 50 or more external users!
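
To put that last point in context, here is a rough back-of-the-envelope calculation, sketched in Python. The 20 Mbit/s uplink is purely an assumed example figure, not a measurement or a recommendation.

```python
# Rough estimate of the uplink time consumed when a 1 GB file hosted on your
# own server is downloaded by 50 external users.
file_size_gb = 1          # size of the shared file in gigabytes
recipients = 50           # number of external users downloading it
uplink_mbits = 20         # assumed office uplink speed in Mbit/s (example only)

total_bits = file_size_gb * 8 * 10**9 * recipients    # total data to serve, in bits
hours = total_bits / (uplink_mbits * 10**6) / 3600    # time with the uplink fully saturated

print(f"{hours:.1f} hours of a fully saturated {uplink_mbits} Mbit/s uplink")
# Roughly 5.6 hours, during which normal Internet use at the office would suffer.
```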

Support & Management

Another point to take into account after the initial go-live of a managed file transfer solution are the costs associated with ongoing support and management. Specifically:

Internal Support – If you deploy an MFT solution, the responsibility of support and management falls on internal team members, whereas with a hosted solution much of the ongoing support is outsourced to the service provider.

Availability – Generally hosted services run in high-availability data centres.  Therefore they offer guaranteed uptime, with load-balanced solutions as the norm, and include SLAs.

Scalability – In a hosted environment, scalability of your product is generally available on demand or at the touch of a button.  It’s not always quite so simple with an on-premise MFT solution.

Back-up/disaster recovery – Usually provided as part of the service by hosting providers, back up and disaster recovery can be costly when purchased as part of a solution package.

Deployment

Bringing an on-premise managed file transfer solution online has its challenges. Ports need opening on firewalls, rules need setting up, plus there are considerations about the design of the solution and how it will sit within the corporate infrastructure.  Hosted solutions are incredibly fast to deploy given that much of the above doesn’t come into play.

Functionality & workflow

Typically, functionality and workflow features are key drivers in the decision making process. On-premise managed file transfer solutions offer far more in the way of a comprehensive feature set:

Bottlenecks – Having files local to you when you need them can save a considerable amount of time in terms of lost productivity, especially for larger enterprise deployments.  Pushing data to remote services can impact upon an end user’s time or can delay internal processes.

Integration – More often than not, hosted solutions are somewhat limited in the level of integration capabilities they offer.  This is an important factor to take into consideration, as a lack of integration reduces the potential to automate tasks and minimise the man hours wasted on routine work.  There are of course exceptions to the rule, as there are some very capable hosted technologies.

These are just some of the key areas to be mindful of if you’re weighing up the pros and cons of hosted vs on-premise managed file transfer solutions.  Ultimately, the decision will be fuelled by the scope of your requirements and the size of your budget.  Whatever these may be, there is a managed file transfer solution out there to fit your business needs.  If you want some help pinpointing the right solution for you, please contact Pro2col on 0333 123 1240.


Planning Considerations for Managed File Transfer in 2011

If you’re considering implementing a managed file transfer solution during 2011 then you’re in luck.  Pro2col has been lobbying Ipswitch’s marketing team to provide a webcast specifically for the UK marketplace, addressing the considerations that should be taken into account when planning the implementation of a managed file transfer solution.

We’re very pleased to announce that Jonathan Lampe, Vice President of Product Management at Ipswitch File Transfer, will be hosting a webinar titled “Learn how and why MFT should fit into your IT structure, as well as insights into top security, compliance and data breach concerns.”  Jonathan will also be discussing:

  • Smart steps to lower your data breach risk
  • Tips on the “person-to-person” aspect of Managed File Transfer
  • How to evaluate, select, and implement an MFT solution with rapid time-to-value and a positive ROI
  • Strategic approach to MFT project planning

The webinar will take place on Wednesday 26th January at 3.00 pm.

REGISTRATION FOR THIS EVENT IS NOW CLOSED.


Data: Transferring the Burden Under PCI DSS

GT News have just published a great article written by Jonathan Lampe (Vice President of Product Management at Ipswitch) regarding data transfer requirements under PCI DSS.  If anyone is looking for a PCI DSS compliant solution for transferring data, these are the points they really need to be taking into consideration:

Data: Transferring the Burden Under PCI DSS

Jonathan Lampe, Ipswitch – 08 Jun 2010

Despite widespread adoption of Simple Object Access Protocol (SOAP) and transaction sets in the financial industry, a surprisingly high percentage of the data flow is still represented by files or bulk data sets. In 2009, Gartner determined that bulk data transfers comprise around 80% of all traffic. This is probably a surprise if your company is among the many with millions invested in just managing individual transactions – but there are good management and security reasons for this continuing situation.

Why is File Transfer Still Common?

Financial institutions and item processors are still ‘FTP-ing’ (using the file transfer protocol), emailing, or sending and sharing files instead of transactions for a number of reasons. First, it hides the complexity of the systems on both ends – there is no reliance on, or concern about, libraries of transactions and responses related to one system and a different set related to another. Second, it reduces the risk of transmission failure and makes it less risky for employees to send a small number of files or bulk data sets rather than a large number of transactions. Finally, it also increases the reliability of the overall operation.

The Managed File Transfer Industry

The managed file transfer (MFT) industry comprises providers whose solutions manage and protect these bulk data sets as they move between partners, business areas and locations. Collectively they address the challenges presented by bulk data transfers and by principles-based rules of the sort that have become common over the past few years – for example the Data Protection Principles or International Financial Reporting Standards (IFRS). Fundamentally, these are rules that tend to embody real-world outcomes as a standard. So, for example, the reported outcomes of penetration testing depend for certification as much upon the experience of the tester (who may be an employee) as upon the integrity of the network. This is all fine – until your network meets the real world. Principles-based rules tend to put the onus squarely on us to make and maintain systems.

For consumers, consultants and Payment Card Industry (PCI) assessors, this is undoubtedly ‘a good thing’. For those handling card data, the costs of validated and effective compliance represent a potentially significant burden that’s worth passing on to an industry that has quietly got on with the job well before buzzwords, such as ‘cloudsourcing’ or even ‘outsourcing’, entered the lexicon.

Vendors and Technologies Need Evaluation

It therefore makes a great deal of sense to place as much of that onus, and indeed risk and potential liability, on the shoulders of others – suppliers and consultants – as we can. Although PCI Data Security Standard (PCI DSS) can, and does, descend into tick-box detailed level rules in some places – which it makes very good sense to sign off to trusted third parties – nevertheless, significant ongoing parts of our obligations under PCI DSS are essentially management issues. Despite subjective components and PCI requirements to take ongoing account of best practices, the technologies themselves can still be evaluated on a relatively straightforward mechanistic basis, provided that they are submitted to sufficient scrutiny.

At the most basic level, subjective terms such as ‘adequate’ or ‘insecure’ are sometimes to be understood (explicitly or otherwise) as denoting specific technologies or other standards in line with industry best practice and are, therefore, a route to initially evaluating software on a tick-box basis.

Beyond Ticking Boxes – Four Initial Considerations

When evaluating data security technology in the context of regulated activities, you should look at how four categories – confidentiality, integrity, availability and auditing – contribute to security and compliance. These headline considerations are designed to assist in assessing whether a data technology or process is likely to provide one-time compliance for the purposes of PCI DSS.

Confidentiality ensures that information can be accessed only by authorised individuals and for approved purposes. For the purposes of PCI DSS this means that employees should have the minimum level of access necessary to do their job. Confidentiality begins with authentication of login credentials on every secure application, starting with a strong password policy backed by robust account expiry procedures and password management.

Integrity, as repeatedly addressed in PCI DSS requirements 10, 11 and 12, is relatively under-appreciated and often understood solely as a security issue, but it is a critical component of compliance. It means ensuring the uncompromised delivery of data, with full Secure Hash Algorithm (SHA)-512 support. In the case of file transfer operations, non-repudiation takes data security to the highest level currently available by adding digital certificate management to secure delivery and data encryption, beyond the requirements of PCI DSS. The setting up of alerts is a relatively easy goal – a box ticked on the route to compliance.
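
To make the integrity point concrete, the following is a minimal sketch in plain Python (standard library only, not any particular vendor’s implementation) of how a SHA-512 digest taken before sending can be compared with one taken after receipt to detect corruption or tampering in transit. The file names are hypothetical.

```python
import hashlib

def sha512_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-512 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# The sender publishes the digest alongside the file; the receiver recomputes it.
sent_digest = sha512_of("payments_batch.csv")              # hypothetical outgoing file
received_digest = sha512_of("incoming/payments_batch.csv") # hypothetical received copy
assert sent_digest == received_digest, "file was altered or corrupted in transit"
```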

Availability is not explicitly addressed in PCI standards but is a critical component of any overall security strategy. It can and should be addressed, if not guaranteed, through load balancing and clustering architectures that support automatic failover and centralised configuration data storage to minimise the chance of a data breach.

Auditing capabilities should be demonstrated by vendors in the form of comprehensive logging and log viewing, with tamper-evident measures to guarantee the integrity of log files. For technology, security and other auditing purposes, all client/server interactions and administrative actions should be logged.
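
A common way of making an audit log tamper-evident is to chain each entry to the hash of the previous one, so that editing or deleting any earlier record breaks the chain. The sketch below illustrates the idea in Python; it is a simplified illustration, not the logging format of any particular MFT product.

```python
import hashlib
import json
import time

def append_audit_entry(log: list, actor: str, action: str) -> dict:
    """Append an entry whose hash also covers the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"ts": time.time(), "actor": actor, "action": action, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)
    return entry

def verify_chain(log: list) -> bool:
    """Recompute every hash; return False if any entry has been modified."""
    prev = "0" * 64
    for e in log:
        body = {k: e[k] for k in ("ts", "actor", "action", "prev")}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

audit_log: list = []
append_audit_entry(audit_log, "admin", "created user 'j.smith'")
append_audit_entry(audit_log, "j.smith", "downloaded statement.csv")
print(verify_chain(audit_log))   # True until any entry is altered
```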

The Hitchhiker’s Guide to File Transfer in the PCI DSS Galaxy

The main body of the PCI DSS is divided into 12 requirements.

Section 1 establishes firewall and router configuration standards by requiring all managed file transfer (MFT) vendors to build a product architecture that puts a proxy, gateway or tiered application into a demilitarised zone (DMZ) network segment. This requirement also puts the actual storage of data and any workflows associated with it into internal networks.

The best architectural implementations ensure that no transfer connections are ever initiated from the DMZ network segment to the internal network. Typically this is accomplished using a pool of proprietary, internally established connections. In this way, clients can connect using FTP Secure (FTPS), Secure File Transfer Protocol (SFTP), etc to the DMZ-deployed device, but the transfers involving internal resources are handled between DMZ- and internally-deployed vendor devices by the proprietary protocol.
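
As a highly simplified illustration of that outbound-only pattern, the Python sketch below shows an internal agent dialling out to a DMZ gateway to collect pending transfer jobs, so no firewall rule ever needs to allow a connection initiated from the DMZ into the internal network. The hostname, port and one-line poll protocol are all invented for the example.

```python
import socket

DMZ_GATEWAY = ("mft-gateway.dmz.example.com", 10443)   # hypothetical DMZ-deployed proxy

def poll_gateway_for_jobs() -> bytes:
    """The internal agent initiates the connection *outbound* to the DMZ gateway.
    External clients upload over SFTP/FTPS to the gateway, which queues the work
    until an internal agent like this one connects out and collects it."""
    with socket.create_connection(DMZ_GATEWAY, timeout=30) as conn:
        conn.sendall(b"POLL\n")        # invented request in an invented protocol
        return conn.recv(65536)        # pending job metadata, if any
```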

Section 2 demands that no default or backdoor passwords remain on the system and that systems are hardened. These best practices are generally enforceable with MFT technology, but the best implementations include a hardening utility that also extends protection to the operating system on which the MFT software runs.

Section 3, particularly subsection 3.4, covers encryption of data and storage of keys. To address these issues MFT vendors have an array of symmetric and asymmetric encryption technologies, such as OpenPGP, to ensure data is secured at rest. Cryptography is almost always performed using Federal Information Processing Standards (FIPS)-validated modules and secure overwrite of data is commonly used.
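
As an illustration of encrypting data at rest with OpenPGP, the sketch below uses the third-party python-gnupg wrapper around a local GnuPG installation. The keyring path, file name and recipient key are placeholders, and this shows the general idea rather than how any specific MFT product implements it.

```python
import gnupg  # pip install python-gnupg; requires GnuPG to be installed locally

gpg = gnupg.GPG(gnupghome="/var/lib/mft/gnupg")   # hypothetical keyring location

def encrypt_at_rest(plaintext_path: str, recipient_key: str) -> str:
    """Encrypt an uploaded file with the recipient's public key before it is
    written to long-term storage, and return the path of the ciphertext."""
    encrypted_path = plaintext_path + ".pgp"
    with open(plaintext_path, "rb") as f:
        result = gpg.encrypt_file(f, recipients=[recipient_key], output=encrypted_path)
    if not result.ok:
        raise RuntimeError(f"PGP encryption failed: {result.status}")
    return encrypted_path

# encrypt_at_rest("/data/incoming/cardholder_batch.csv", "ops@example.com")
```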

Section 4 covers encryption of data in motion. All MFT vendors currently support multiple open technologies such as Secure Sockets Layer (SSL), Secure Shell (SSH) and Secure/Multipurpose Internet Mail Extensions (SMIME) in multiple open protocols, including SFTP, FTPS and Applicability Statement 2 (AS2), to provide this protection.
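
For data in motion, the Python standard library is enough to demonstrate an explicit FTPS upload with an encrypted control and data channel. The hostname, credentials and file name below are placeholders; this is a sketch of the protocol usage, not a specific vendor’s client.

```python
from ftplib import FTP_TLS

def upload_over_ftps(host: str, user: str, password: str, local_path: str) -> None:
    """Upload a file over explicit FTPS (FTP over TLS)."""
    ftps = FTP_TLS(host)
    ftps.login(user, password)   # FTP_TLS negotiates TLS on the control channel first
    ftps.prot_p()                # then switch the data channel to TLS as well
    remote_name = local_path.rsplit("/", 1)[-1]
    with open(local_path, "rb") as f:
        ftps.storbinary(f"STOR {remote_name}", f)
    ftps.quit()

# upload_over_ftps("ftps.example.com", "acme", "secret", "reports/statement.csv")
```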

Section 5 ensures anti-virus (AV) protection is in place for systems and the data that passes through them. Most MFT vendors can provide both types of protection with their software. The best allow integration with existing AV implementations and security information and event management (SIEM) infrastructure.

Section 6 requires secure systems and applications. Most MFT vendors conform to the guidelines here, particularly subsection 6.5 on web application security. However, there are large variations in fidelity to subsection 6.6 across the industry. The best vendors use a battery of security assessment and penetration tools, such as HP WebInspect and protocol fuzzers, to ensure that their software exceeds PCI security requirements – and remains that way from release to release. The best vendors also have multiple security experts working with developers to ensure new features are secure by design. These attributes are not always easy to find on a vendor’s website, but they are critical to the long-term viability of an MFT application – be sure to ask.

Sections 7 and 8 cover the establishment of identity and authority. MFT solutions typically have built-in features that cover these issues from multifactor authentication to sharing of accounts. However, there are two common areas of difference between MFT vendors in these sections. The first is the ability to rapidly ‘de-provision’ users (i.e. disable or delete the account upon termination). The second is the proper storage of passwords: some vendors still use unkeyed hashes or weak Message-Digest algorithm 5 (MD5) hashes, both of which are susceptible to either rainbow table or collision attacks.
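
The password-storage point is straightforward to illustrate: instead of an unkeyed or MD5 hash, a salted, iterated scheme such as PBKDF2 (available in the Python standard library) defeats rainbow-table attacks. This is a minimal sketch of the principle, not any vendor’s actual storage format.

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000   # example work factor; tune to your hardware and policy

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived_key) using salted, iterated PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)   # unique random salt per account
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, key

def verify_password(password: str, salt: bytes, expected_key: bytes) -> bool:
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(key, expected_key)   # constant-time comparison
```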

Section 9 is about physical access and is one that many software vendors erroneously ignore. However, subsection 9.5 is about off-site backups and is a function that MFT software often provides. One advantage of using an MFT solution for this purpose is that all the security benefits from the MFT solution flow into the backup process as well.

Section 10 is about auditing and visibility into data. MFT vendors also typically have a strong story around these attributes. Common features of MFT include visibility into the full ‘life cycle’ of files, aggregate reporting, detailed logging of every administrative action, and enforcement of specific service level agreements (SLAs). Some MFT solutions also ensure that audit logs and transfer integrity information are tamper-evident to ensure complete non-repudiation of data delivery.

Section 11 is about regular testing of systems and processes. As mentioned above, MFT vendors who perform these types of tests on their own solutions before releasing their software to the public should be sought out and preferred by companies that must adhere to PCI DSS.

Section 12 is about maintaining and enforcing a security policy down to the level of end user training. Like section 9, section 12 is another section many software providers erroneously ignore. However, the best MFT vendors know that providing fingertip reporting and good user experience to both administrators and end users can go a long way toward encouraging proper use of technology.

PCI DSS Appendices A (‘Additional PCI DSS Requirements for Shared Hosting Providers’) and E (‘Attestation of Compliance – Service Providers’) are also often used when managed file transfer services through value-added network (VAN), software-as-a-service (SaaS), hosted or cloud providers are used. Key requirements here include ensuring that the service provider is not allowing shared users, that different organisations can only see their own logs and that the provider has policies that provide for a timely forensics investigation in the event of a compromise.

Summary

The substance of the PCI burden is an ongoing one. To look down the list of PCI requirements is to scan a list of exhortations to ‘maintain’, ‘monitor’ and ‘ensure’ that echo the ‘manage, monitor and secure’ objectives of basic FTP technology. However, as the March 2008 Hannaford data breach shows, it is possible to be ostensibly compliant – to have ticked all the boxes – and yet not be fully secure.

PCI DSS compliance requires organisations to protect the security, privacy, and confidentiality of information – and to document who accesses the information and the security measures taken to prevent theft, loss, or accidental disclosure.

Click here for further information on the range of products by Ipswitch File Transfer or call Pro2col Sales on 0333 123 1240.


When is FTP better than Managed File Transfer?

So why FTP file transfer, and what’s so great about it?  To be honest this isn’t necessarily a blog to evangelise FTP, but rather the way in which it works – let’s call it ‘sending files’.  With many businesses looking to adopt Managed File Transfer solutions, I thought it might be worth redressing the balance and putting things into perspective.  Managed File Transfer solutions have many good features but, in the case of email-based ones, sending files isn’t one of them.  In many cases the Managed File Transfer solution doesn’t actually send anything; rather, it asks the company email server to send an email to a particular recipient.  The person receiving the email clicks on a link within the email to download the file, or goes to a website to log in and manually download it – so the responsibility is on the recipient to download the file, and there is no guarantee that the file will get there.  In fact there’s no guarantee the email asking the recipient to download the file(s) will get there at all.  Whilst Managed File Transfer solutions cater for the majority of ‘file transfer’ uses, they are certainly not the right solution for every scenario.

FTP

So what do I mean by ‘sending files’?  Historically, the majority of solutions used to send files required a connection to be created between two sites and the files to be pushed/transferred to the receiving site using the appropriate delivery protocol for the connection method, e.g. modem, ISDN or IP.  A typical example that many people will be able to relate to is FTP.  A user with an FTP client enters the details for the server, connects, selects the files to transfer, drags them over to the ‘remote server’ window (in many FTP clients) and the transfer starts straight away.  Once all of the files have been transferred you can see them on the remote server; they are there without question – the files have been delivered.
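
To show how little is involved in this push model, here is a minimal sketch using Python’s standard ftplib. The server details and file names are placeholders, and a real deployment would normally use FTPS or SFTP rather than plain FTP; the point is simply that the sender transfers the files and can immediately confirm they are on the remote server.

```python
from ftplib import FTP

def push_files(host: str, user: str, password: str, local_paths: list[str]) -> None:
    """Connect to the remote FTP server, push each file, then confirm delivery."""
    ftp = FTP(host)                     # plain FTP, for illustration only
    ftp.login(user, password)
    for path in local_paths:
        remote_name = path.rsplit("/", 1)[-1]
        with open(path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)
    print(ftp.nlst())                   # the uploaded files are visible immediately
    ftp.quit()

# push_files("ftp.example.com", "publisher", "secret", ["artwork/cover.pdf"])
```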

In contrast, Managed File Transfer solutions that use email messaging to deliver a message requesting the download of the files have several potential points of failure.  You’ve got to rely on two email servers being happy to deliver the message and not overburdened with other requests, you have to ensure that SPAM filters don’t whisk away your all-important message and, probably most importantly, someone has to be there to open, read and perform the manual process of downloading the file.

In short, FTP file transfer has a place in the enterprise.  If you want to be able to push data to a location with or without manual intervention, then FTP or another file transfer protocol with similar features will do the job.  Certain business-to-business situations will rely on data being sent from one location to the next, e.g. a publisher to its printer, where time is of the essence and any doubt about the delivery of the data has to be avoided.

Finally, it is possible to make FTP more functional and secure than many Managed File Transfer vendors make out; in fact some Managed File Transfer vendors have it built in.  Depending upon the solution you implement, you can get some great functionality to complement this old delivery protocol, and it’s also possible to integrate with workflow solutions, use script integration and utilise APIs and SDKs for complete integration.
