Remove duplicates from a linked list in a single traversal

Given an unsorted linked list, write a function that deletes any duplicate nodes by traversing the list only once.


 

For example,


Input:  5 -> 3 -> 4 -> 2 -> 5 -> 4 -> 1 -> 3 -> null
Output: 5 -> 3 -> 4 -> 2 -> 1 -> null

 
A simple solution would be to consider every distinct pair of nodes in the list and check whether they have the same data. If their data matches, we delete the latter node. The time complexity of this solution is O(n²).
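A rough sketch of that quadratic approach is shown below (this is not the article's own listing); it assumes a minimal singly linked Node struct with an int data field, and the function name is illustrative:

struct Node {
    int data;
    Node* next;
};

// For each node, scan the remainder of the list and unlink any later
// node that carries the same data. Two nested loops give O(n^2) time.
void removeDuplicatesQuadratic(Node* head) {
    for (Node* curr = head; curr != nullptr; curr = curr->next) {
        Node* runner = curr;
        while (runner->next != nullptr) {
            if (runner->next->data == curr->data) {
                Node* dup = runner->next;
                runner->next = dup->next;   // bypass the duplicate node
                delete dup;                 // and free it
            } else {
                runner = runner->next;
            }
        }
    }
}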

 
We can do better using hashing. The idea is to traverse the list and insert each node's data into a set. If the current node's data is already present in the set (i.e., it has been seen before), we delete that node and move on to the next one. At the end, all duplicates will have been removed from the list.

C++
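The original C++ listing is not reproduced here; the following is a minimal sketch of the hashing approach, assuming the same Node struct as in the earlier sketch (the function name removeDuplicates is illustrative):

#include <unordered_set>

struct Node {
    int data;
    Node* next;
};

// Single pass: remember every value seen in a hash set and unlink any
// node whose data has already been seen. O(n) time, O(n) extra space.
void removeDuplicates(Node* head) {
    std::unordered_set<int> seen;
    Node* prev = nullptr;

    for (Node* curr = head; curr != nullptr; ) {
        if (seen.count(curr->data)) {
            // Duplicate: the head is never a duplicate, so prev is non-null here.
            prev->next = curr->next;
            delete curr;
            curr = prev->next;
        } else {
            seen.insert(curr->data);
            prev = curr;
            curr = curr->next;
        }
    }
}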


Output:

5 -> 3 -> 4 -> 2 -> 1 -> null



 

The time complexity of the above solution is O(n), and it uses O(n) auxiliary space.

 
Similar post: Remove duplicates from a sorted linked list

 

Thanks for reading.

Please use our online compiler to post code in comments. To contribute, get in touch with us.
Like us? Please spread the word and help us grow. Happy coding 🙂
 



Comments

speedplane:

I’m not going to take the bait, but I feel like the above algorithm can be written in five lines of Python, which has built-in hash sets.

Ren:

Look, I write most of my code in C and don’t really have experience with C++ and its built-in hash tables. But it seems silly to me that building a pseudo-copy of the original list follows the tenet of ‘in a single traversal’.

The only real way to do this is by getting a guaranteed sorted list from the start (or sorting it yourself, but that’s again not a single traversal).

I get that hash maps would be a good way to speed up comparisons; the task description just seems wrong to me though.