Wednesday, December 31, 2008

Ajax Application Security

Ajax is not itself an application-security problem, but the Ajax programming model does make applications more porous from a software-engineering standpoint. JavaScript-powered client-server interactions enlarge the attack surface.

Ajax application security issues can be addressed by taking care in the design of the application architecture. Here I am going to introduce practices that can resolve common Ajax application security problems.

SQL Injection: In these attacks, hackers first research common SQL error messages to find vulnerable pages and then modify SELECT statements, for example through a simple TextBox, to gain access to a database. Ajax complicates matters because it makes it possible to compose SQL expressions on the client side.

Tips to prevent this kind of attack:
  1. Use CustomErrors pages in the Web.Config file to prevent attackers from identifying an application's particular vulnerability.
  2. Use stored procedures or parameterized SQL queries instead of dynamically constructed SQL queries.
  3. Perform input validation on the server side, not through JavaScript.
  4. Use a least-privileges account for your database and do not allow access to system data. This builds on the notion that security should be implemented in multiple layers: "You don't want them to be able to thwart one and then get to the data."
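Tip 2 above is the core defense. A minimal sketch in Python (using sqlite3 as a stand-in database; the table and column names here are hypothetical) shows how a parameter placeholder keeps attacker input from changing the query's structure:

```python
import sqlite3

# In-memory database with one illustrative row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def find_user(username):
    # The "?" placeholder makes the driver treat the input strictly as
    # data, so a payload like "' OR '1'='1" cannot alter the statement.
    cur = conn.execute("SELECT username FROM users WHERE username = ?",
                       (username,))
    return [row[0] for row in cur.fetchall()]

print(find_user("alice"))         # matches the stored row
print(find_user("' OR '1'='1"))   # injection attempt returns nothing
```

Had the query been built by string concatenation, the second call would have matched every row; with the placeholder it matches none.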

Information Leakage: If the JavaScript APIs that power an Ajax application are not properly secured, hackers can use application workflow data exposed on the client side to piece together server-side services. The best way to protect against this, not surprisingly, is to keep security validation on the server side. The only validation that should occur on the client side is that which defines the user experience.
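A small sketch of this principle: the server re-checks every input before acting on it, regardless of whether client-side JavaScript already validated it. The account-id format and limits here are hypothetical examples, not anything prescribed above.

```python
import re

# Hypothetical server-side rules; client-side checks are treated as
# a user-experience nicety only and are never relied upon.
ACCOUNT_ID = re.compile(r"^[A-Z]{2}\d{6}$")

def handle_transfer(account_id, amount):
    # Re-validate on the server: never trust that client JS ran at all.
    if not ACCOUNT_ID.match(account_id):
        raise ValueError("invalid account id")
    if not (0 < amount <= 10_000):
        raise ValueError("invalid amount")
    return f"transfer of {amount} to {account_id} accepted"

print(handle_transfer("AB123456", 500))
```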

Cross-Site Scripting: In these attacks, hackers foist malicious JavaScript onto unsuspecting users. This tends to happen on Web sites featuring a simple TextBox and a button click that encapsulates text. Instead of, say, posting a comment in a forum, hackers will use the TextBox to insert a script tag that could, for example, transfer money from your bank account to theirs. Ajax, as you might expect, leaves more APIs open than a traditional Web application does.

To protect against cross-site scripting, do your own validation to make sure you are not allowing this type of input. The best way to accomplish this is a white list, which states exactly which characters a user is allowed to type into the TextBox. Make sure this list does not admit script tags or HTML markup.
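A minimal white-list check along these lines, assuming a comment field that only needs letters, digits, spaces and basic punctuation (the exact allowed set would depend on your application):

```python
import re

# White list: accept ONLY these characters. "<" and ">" are absent,
# so script tags and HTML markup can never get through.
WHITELIST = re.compile(r"^[A-Za-z0-9 .,!?'-]*$")

def is_allowed(text):
    return bool(WHITELIST.match(text))

print(is_allowed("Nice article, thanks!"))           # True
print(is_allowed("<script>alert('xss')</script>"))   # False
```

The point of a white list over a black list is that anything not explicitly permitted is rejected, so you do not have to enumerate every dangerous construct.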

Cross-Site Request Forgery: These attacks use malicious image tags in emails and leverage browser cookies. The image acts as a placeholder for what is really a query string that performs the aforementioned money transfer. Once the page loads, the image request triggers an HTTP GET, and cookies are passed along with it. "The variables coming in from the query string look exactly the same as a post. It's using that cookie that's stored on your computer, and your information, to make that query work."
Protecting against cross-site request forgery involves three best practices, he continued. The first is to use HTTP POST data as opposed to HTTP GET data; the latter can be used for retrieving data, but it should not be used for performing any sort of action with that data. The second is to use one-time, per-request tokens. The third is to stand up to nagging end users and stop using persistent cookies for authentication, especially if sensitive data sits behind a log-in screen.
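The one-time, per-request token practice can be sketched as follows. This is an illustrative in-memory version (the session store, names and token length are my assumptions): the server issues a random token with the form, remembers it against the session, and honors a POST only if the token comes back, exactly once.

```python
import hmac
import secrets

# Hypothetical server-side session store: session id -> pending token.
SESSION = {}

def issue_token(session_id):
    token = secrets.token_hex(16)
    SESSION[session_id] = token   # kept server-side with the session
    return token                  # embedded as a hidden field in the form

def check_post(session_id, submitted_token):
    expected = SESSION.pop(session_id, None)   # pop => single use
    return expected is not None and hmac.compare_digest(expected,
                                                        submitted_token)

t = issue_token("sess-1")
print(check_post("sess-1", t))   # True: genuine form post
print(check_post("sess-1", t))   # False: token already consumed
```

A forged request from another site cannot know the token, and even a replayed genuine request fails because the token is consumed on first use.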
JavaScript Hijacking: This variation of cross-site request forgery, which thanks to ASP.NET and IIS authentication does not occur in Internet Explorer, sets script tags to a particular URL that, when an HTTP GET is issued, returns a JSON-formatted string. From there, the hacker modifies the object prototype to peer into JSON values as they are created. In addition to using HTTP POST, Lombardo said the best way to protect against JavaScript hijacking is to encode JSON strings on the server side, not the client side.
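A sketch of server-side JSON handling: the server serializes the response itself rather than assembling JSON on the client, and, as a common additional defense against script-tag inclusion, prefixes the body so it is not executable JavaScript. The `while(1);` prefix is a widely used convention I am adding for illustration, not something prescribed in the article.

```python
import json

def json_response(payload):
    # Serialize on the server; the prefix makes the body an infinite
    # loop (not valid data) if pulled in via a <script src=...> tag.
    return "while(1);" + json.dumps(payload)

resp = json_response({"account": "AB123456", "balance": 1500})
print(resp.startswith("while(1);"))   # True

# A legitimate XMLHttpRequest caller strips the known prefix, then parses.
data = json.loads(resp[len("while(1);"):])
print(data["balance"])                # 1500
```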
Lombardo offered two tidbits of advice that were not covered in his discussions of the five common Ajax security vulnerabilities.
First, he recommended removing the WSDL from Web services, as this only gives hackers information about an application that they otherwise would not be able to determine.

Second, he said it is a good idea to place WebMethods and WebServices in separate classes.

Thanks & Regards

Abhishek Hingu

Sr. Software Eng.

Indianic Infotech Pvt Ltd

Tuesday, December 9, 2008

Windows Azure - Code Name - Astoria

I had been waiting a long time, and insisting, that Microsoft release ADO.NET Data Services on the Windows Azure platform. Last week Microsoft announced the code name "Astoria", which ended my waiting.

The first version of the ADO.NET Data Services Framework (a.k.a. Project "Astoria") introduced a way of creating and consuming flexible, data-centric REST services. In this incubation project, we are now working on creating an end-to-end story for taking data services offline using synchronization. By integrating data services with the Microsoft Sync Framework, we will enable developers to create offline-capable applications that have a local replica of their data, synchronize that replica with an online data service when a network connection becomes available, and use replicas with the ADO.NET Entity Framework for regular data access.

What is Astoria?
The ADO.NET Data Services framework provides a common set of conventions for exposing, accessing and manipulating data across data-centric services using internet-friendly protocols and message formats.
As online, data-centric services have become increasingly prevalent, so have the scenarios in which we wish to make use of these services and the data they provide. Consumers are no longer satisfied with being able to access their data only when connected to the internet. Service providers would like to offer consumers the ability to synchronize with the data they expose, for example to enable offline access when the user is temporarily disconnected from the internet.
Astoria Offline strives to enable developers to create offline-capable applications that have a local replica of their data, synchronize that replica with an online data service when a network connection becomes available, and use those replicas with the ADO.NET Entity Framework for regular data access.
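To make the "common set of conventions" above concrete: Astoria-style data services address data through URIs, with entity sets, keys, navigation properties and `$`-prefixed query options in the URL. A small sketch that builds such URIs (the service root and entity names here are hypothetical):

```python
from urllib.parse import urlencode

# Hypothetical ADO.NET Data Services endpoint.
ROOT = "https://example.com/Northwind.svc"

def entity_uri(entity_set, key=None, nav=None, **options):
    uri = f"{ROOT}/{entity_set}"
    if key is not None:
        uri += f"('{key}')"          # entity key in parentheses
    if nav:
        uri += f"/{nav}"             # navigation to related entities
    if options:
        # Astoria query options carry a "$" prefix, e.g. $top, $filter.
        uri += "?" + urlencode({f"${k}": v for k, v in options.items()})
    return uri

print(entity_uri("Customers"))   # the whole entity set
print(entity_uri("Customers", key="ALFKI", nav="Orders", top=5))
```

The second call yields a URI addressing the first five orders of one customer, which is the kind of uniform, REST-style addressing that makes synchronizing a local replica against the service tractable.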

Monday, December 1, 2008

Cloud Computing - Windows Azure Platform

Everybody is talking about Azure, Microsoft's operating system for the cloud. How can I not write something about it? After all, I was there when this paradigm-changing strategy was unveiled by Ray Ozzie at PDC. Read the transcript of Ray Ozzie's keynote or watch the video to know more.
So what is all this about? Azure, Cloud OS, .NET Services, Live Services and tons of other stuff? Let me try and paint the picture.
Let's first look at what an operating system (OS) is all about. If anyone reading this post is old enough to remember the days of microprocessor devices without any operating system on them, you will understand how OSes have changed the world of computing: they have made computing so much more accessible and usable by the common man.
Imagine what a device without an operating system would be like. Suppose you had to just copy a file from one location to another. You would potentially have to do one or more of the following: write a program in the instruction code of that particular processor that would instruct it to first 'mount' the source drive (assuming that the source hardware is plugged in and understood by the processor), 'mount' the target drive, tell the processor where the beginning of the source file is (the location of the sector, block, etc.) and where it should begin writing on the target, tell it the memory location where it will temporarily keep the read contents and their size, and then issue the machine instruction equivalent to writing the bits onto the drive. It could be considerably more complex than this. Today we hardly notice this complexity when we issue the "copy" command or drag and drop a file from a USB drive to the hard drive. Why? Because the OS abstracts all of this complexity for you!
Disk operating systems were an important phase in the evolution of OSes. Meant to abstract the complexity of creating, copying, deleting and managing files, and disk input/output in general, they soon evolved into more complex and capable systems, managing other peripherals, memory, applications, drivers and more, and giving you an easy-to-use interface in which you can focus on your own problems (be it writing code, word processing or data analysis) rather than having to deal with the internals of the machine or the processor.
OSes laid the foundation for the rapid evolution of computing, allowing a common person to use computers and benefit from them.
This is what an OS running on our personal computers does. The common tasks that we expect a PC OS to perform include managing memory, disks, peripherals, the display, the user interface, software runtimes and so on. Now let's take a step back and imagine you were to build an OS that would run on a remote machine, accessible to anyone in the world, allowing you to put your applications on it, run them and make them available to anyone in the world. What functions would be expected from such an OS? What modules would it need to handle all the possibilities and scenarios that arise in the cloud, so that it minimizes complexity for the end user in the same way the PC OSes have done?
To bring the problem closer to a real scenario, let's add one more level of complexity. Since it is not possible to have one big computer with enough computing resources and memory to address the needs of everybody in this world, it is obvious that the remote machine would actually be a complex cluster of thousands of machines/servers, RAID drives and other resources. The OS you are embarking to build would in effect be a sort of mega-OS: an OS to manage the thousands of OSes running on these thousands of machines (often in a virtualized environment). So what would be expected of this mega-OS? Apart from all the functionality provided by a common OS, it would need to cope when any of the OSes running on any of the machines fails, when a machine goes down, when the resources on a machine fall short of requirements, when new machines/OSes are added, and with pretty much anything else needed to manage this huge cluster of machines, which would be almost a black box to the end user. Add to that managing multiple users, their identity, their data, transactions, maintaining throughput and so on.
This is exactly what Microsoft's OS in the cloud, Azure, is all about. Azure is an operating system that runs on a huge cluster of servers located in multiple places across the world and is exposed to users by way of various services (.NET Services, Live Services, SQL Services, SharePoint Services, Dynamics CRM Services), such that they no longer have to worry about hardware, memory, load, the electricity bills of data centers, maintenance, replacement, or scaling hardware and resources as their business grows. All they have to do is build their applications for Azure (almost the same way they did for Windows running on a customer's box, with the additional task of keeping in mind the best practices and patterns for building effective and efficient cloud applications) and host them in the Microsoft data center running Azure. Azure in the background will take care of ALL the underlying complexities of deploying the application on one or more machines, load balancing, recovery from failure, guaranteed availability and other issues relevant to the cloud, making your application immediately available to millions of Internet users.
Remember the impact that .NET had on the way we develop applications for the Internet? Imagine writing a web service in native code, deploying it and testing it; compare that to doing the same thing in .NET. I believe Azure is going to bring the same kind of revolution to the world of computing. The ease with which you can develop an application using traditional programming knowledge and tools like Visual Studio, host it in the cloud running on Azure and forget about the rest is simply phenomenal, and anybody with an eye on the future cannot ignore it.
This post was intended as a primer on Azure. In subsequent posts I plan to delve deeper into the architecture of Azure, the tools for Azure development, and building your first Azure application. Keep a watch… and if I do not post soon, leave messages to force me to do so. :)

Microsoft Azure .NET & SQL Service Development at:
http://www.indianic.com/window-azure-development.html